Minecraft's rendering does, indeed, fail epically. It could be written to work much better, which would fix most framerate issues. Notch shouldn't be afraid to ask for rendering tips, and he shouldn't be afraid to revise his code either. Java isn't the problem; in fact, it makes things easier for us if anything. In my opinion this should be a back-burner project, hidden until it's ready for testing. There may be a point at which a compiled Windows executable would run better, but it'd be a less noticeable increase in framerate.
Quote from GeekAtLifeMeter »
Spontaneous combustion - one of the many hazards of travel between worlds.
Quote from The_Webster »
Quote from G@ngst@ »
1 ke@p try1ng to bU1ld cr@p, but the rOF KEEPS FA1ING D0WN?
Can someone translate what the guy is saying? I don't understand Roblox talk...
Yeah, I have 4GB RAM and I USED to be able to play on Far fog view with Fast graphics, but now if I do that I get around 7fps. So no go; I have to play on Small now to get anything over 15fps.
Well, if the game doesn't get dumbed down (slower updates, no Mac or Linux compatibility, less content) and it actually does help by increasing the FPS or helping in other ways, then there is no way to say no.
Once Minecraft is finished, its true brilliance will be the mods people create. SMP is, and always will be, buggy and unfit for PvP.
Pessimism much? No, SMP will not always be buggy, Mojang just needs to pull its fingers out and repair code that is obviously not working. And to be honest when I managed to actually play SMP smoothly on a good server I found a great experience - trying to survive a night with a couple of friends in a dark cave, with lots of mobs and enemy player gangs trying to kill us is thrilling. Although the implementation does need some working on, the concept is obviously there and has great potential.
A complete rewrite is eventually going to happen regardless. Maybe not in another language, but at least it will be rewritten with performance in mind. It will have to be, when even the most powerful CPUs out there won't be able to absorb the poor performance of the current algorithms (currently they still can, hence why only people below the 2.4GHz mark or so are having issues). I'm looking forward to that, actually - I'm seriously going to laugh when all of you who are saying "omg stop whining ur computer sux, MC haz system requirements" today start complaining about bad framerates on your state-of-the-art computers. Mark my words.
The point being that Minecraft should be playable on far distance on any frikkin netbook (yes, even written in Java).
Also, if the code gets ported eventually, it should open up a range of new improvements such as better graphics (not Crysis-like, but with improved physics & lighting), where Java might indeed not cut it.
And here is a little tutorial on garbage collection for dummies, to help people understand what memory leaks are: basically, one of the differences between C++ and Java (replace C++ with any other manually-managed compiled language really, it's just for context) is that C++ requires the programmer to manage memory appropriately by allocating and freeing it properly, while Java simply lets the programmer allocate and essentially frees the memory "later" on their behalf.
This is done by a Garbage Collector, which in its simplest form is a loop that walks through all of the objects allocated by the Java code, decides whether they are actually still used or not, and frees them accordingly.
The pro of this method is that the programmer doesn't need to worry about memory management and memory leaks - Java does it!
The con is that it doesn't do it well at all. The Garbage Collector is only called every X seconds (or when system memory is running low), meaning that within this interval nothing gets freed, which is why memory usage in Minecraft grows so quickly. Another con is that when it is called, it slows down the game by quite a bit (although this is mostly negligible).
The biggest problem with the Java method is that if the Garbage Collector isn't called fast enough, the game might actually run out of memory, which, due to how Windows works, causes a sizeable amount of its memory to be dumped to the hard drive. However, your hard drive is immensely slower than your memory, and when this happens, you're pretty much doomed, and the game slows to a crawl, until the Garbage Collector cleans it up (and then it starts again). And if you're wondering, no, the garbage collector doesn't kick in fast enough when the system runs out of memory and can't quickly clean up some space before the system dumps to the hard drive.
This is especially a problem on 32-bit Java, where Java programs can't allocate much more than 1 gigabyte of heap (shouldn't the 32-bit limit be 2? I can't allocate more than one on my laptop and I have 4 gigs; not sure here - possibly because the JVM needs one contiguous block for the heap).
Personally I prefer the C++ way of doing it, because it is basically better. And managing memory isn't that hard, really. And there's this really magical moment when you fire up your game, look at the task manager and see that memory usage is going up and down smoothly as you play the game, with no memory leaks. Really satisfying.
So overall if you have enough memory to have a "safety margin" while running Minecraft, you wouldn't notice a significant performance gain from a direct port to another language without changing the code. If you don't, well, you would notice a big difference. A quick and dirty workaround for those of you with this problem is to reduce the interval at which the garbage collector is called (google it).
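A minimal, self-contained sketch (invented code, not Minecraft's) of the behaviour the tutorial above describes: short-lived garbage piles up between collections, and `System.gc()` is only ever a hint the JVM may ignore:

```java
// Sketch: watch heap occupancy climb while throwaway objects are
// allocated, then request a collection. Class and method names are
// invented for illustration.
public class GcDemo {
    // Currently used heap bytes, as reported by the Runtime API.
    static long usedHeap() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        long before = usedHeap();
        // A burst of short-lived garbage, like a game loop building
        // temporary objects every frame.
        for (int i = 0; i < 100_000; i++) {
            byte[] junk = new byte[64]; // unreachable immediately
        }
        long during = usedHeap();
        System.gc(); // only a *hint*; the JVM is free to ignore it
        long after = usedHeap();
        System.out.println("before=" + before + " during=" + during + " after=" + after);
    }
}
```

The exact numbers printed depend entirely on the JVM and its collector settings, which is rather the point being made above.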
Garbage collection is pure voodoo. We had a guy at my last gig whose primary job was to optimize the garbage collection for our server farm, and when he hit a sweet spot for one of the high-profile applications (a fantasy sports application), they had the profile graph printed on shirts and passed out to the team. It's no joke.
I think people may also be running into problems caused by plugin writers. Some of these guys are learning Java just to mod the game and don't understand that creating objects can be expensive, and they either never release them or keep them around longer than necessary. Next thing you know, they're complaining about the game's performance when in fact the plugin was the bottleneck.
In any case, hearing Notch talk about the iPhone, I have no doubt they are actively porting the game to a compiled language so they can get it into the App Store. Whether this translates into a native desktop client is hard to say, since Notch sounds pretty comfortable in Java and can prototype ideas relatively quickly in it.
I think the server should go to a compiled language. Right now it's an absolute RAM-hog.
Quote from Bacterius »
Personally I prefer the C++ way of doing it, because it is basically better. And managing memory isn't that hard, really. And there's this really magical moment when you fire up your game, look at the task manager and see that memory usage is going up and down smoothly as you play the game, with no memory leaks. Really satisfying.
I wish Java would allow more manual memory management beyond "call System.gc() and pray". If there's anything beyond that, I haven't heard of it.
I used to be able to play minecraft a few builds ago, but I've since become unable to play due to the intensely low FPS. It kinda sucks - I would appreciate it being recoded now so I could continue enjoying it before its final release :\
"Any sufficiently advanced technology is indistinguishable from magic"
-Arthur C. Clarke
"Any sufficiently rigorously defined magic is indistinguishable from technology"
-Larry Niven
I was somewhat "inspired" to write a blog post recently for reasons unrelated to this thread, but parts of what I wrote are relevant.
For example, regarding C++ and memory management:
As for memory management: once you learn how to do it, and how to work with it, it just gets in the way. No matter how much practice you get with malloc() and free(), you're still going to have bugs where you miss a malloc or a free, or perform a double-free, or something; the only recourse is always to create some sort of automatic tool or use a design pattern to make finding these problems easier. What something like garbage collection does is make such design patterns unnecessary, since there is essentially a design pattern for memory management built in.
The point being that Minecraft should be playable on far distance on any frikkin netbook (yes, even written in Java).
Why?
This is done by a Garbage Collector, which in its simplest form is a loop which walks through all of the objects allocated by the Java code, decides whether they are actually used or not, and frees them in consequence.
Absolutely no modern virtual machine currently uses that method. The two simplest methods, implemented in the very earliest virtual machines (Sun's and Microsoft's), were two algorithms: mark-and-sweep, and stop-and-copy.
The Sun VM used mark-and-sweep, which is pretty much what you described: it goes through the list of objects, finds those that are no longer referenced, "marks" them, and then when it's done goes over the list again and deallocates all of them (it does it in two steps because deallocating the objects as it walks could cause issues with invoked finalizers). The MS Virtual Machine used stop-and-copy, which was faster but cost a bit more memory: instead of marking objects that weren't used, the MS VM copied all objects that were still live to a new piece of memory, then basically said "OK, this is the new memory block" and discarded all of the memory it had been using. That doesn't sound right, since it sounds like it should be slower; I probably didn't describe it properly. Oh well.
In any case, all modern Java VMs use a garbage collector implemented with a generational approach (more Java-relevant detail here; all I know about generational GCs is from .NET, which probably isn't the same): http://java.sun.com/docs/hotspot/gc1.4.2/faq.html
The biggest problem with the Java method is that if the Garbage Collector isn't called fast enough, the game might actually run out of memory, which, due to how Windows works, causes a sizeable amount of its memory to be dumped to the hard drive
Wrong. Only inactive pages are paged out to disk. Other applications might be page-faulted back into physical memory when you switch to them, but Minecraft's private working set will always "take priority" as long as it is "hitting" its memory frequently enough. As the foreground application, its entire working set will basically be kept "at the ready" in physical memory; I mean, it is the active application.
(replace C++ by any other compiled language really, it's just for context)
Visual Basic 6 compiles to native code. It doesn't require you to perform allocations and deallocations manually. Delphi compiles to native code. It doesn't require it either.
and
they are actively porting the game to compiled language
Java is a compiled language. So I guess their work is done then. You both obviously mean a language with a compiler that can compile to machine code. Well, there are native-code compilers for Java, but the thing is, that defeats the entire purpose of the language, which is to be portable. If you want to tie yourself down to a certain architecture or maintain some convoluted setup for compiling to different architectures, that is your prerogative, but to come to the half-baked conclusion that compiling to native code is better is simply wrong. For one thing, it limits your options: if you compile a program with certain optimizations, that's it, you'll have to recompile to change them. With Java, the compiled bytecode is compiled to the machine code of the native environment, with optimizations based on the configuration, such as the availability of processor extensions, x64, etc.
And there's this really magical moment when you fire up your game, look at the task manager and see that memory usage is going up and down smoothly as you play the game, with no memory leaks. Really satisfying.
Um... I get that with my C# game. Not sure why it's magical, unless, despite what you just stated, manual memory management is hard to do right?
So overall if you have enough memory to have a "safety margin" while running Minecraft, you wouldn't notice a significant performance gain from a direct port to another language without changing the code. If you don't, well, you would notice a big difference. A quick and dirty workaround for those of you with this problem is to reduce the interval at which the garbage collector is called (google it).
I believe your information on the garbage collector was based on current information in 1996.
Java is a compiled language. So I guess their work is done then. You both obviously mean a language with a compiler that can compile to machine code. Well, there are native-code compilers for Java, but the thing is, that defeats the entire purpose of the language, which is to be portable.
Yes, I meant machine code as opposed to bytecode. But if they are going to port this to iOS devices (which Notch has expressed interest in doing), they will HAVE to port it to another language as Java is not allowed to run on iOS devices per Apple's TOS as I'm sure you know. Hence, my assertion that they are actively porting it. This isn't rocket science here.
You know, it wouldn't be so bad if we got a map with an end! I don't care for these infinite maps! Also, imagine snow and rain in the new update 1.5; it will kill Minecraft! Someone will convert Minecraft for sure if Notch doesn't do it! If we ever get a full game and Notch releases the code, someone will port it to C++ or something else!
Because it does not require even close to the amount of power required by even mid-range games such as Half-Life 2. Say what you want, but it is inferior in every way in terms of computational power - fewer vertices, fewer textures, etc. And even if I turn out to be wrong and you still need a somewhat decent processor for MC, what I'm really saying is that TINY VIEW DISTANCE SHOULD NOT SLOW DOWN MORE THAN FAR DISTANCE. Optimisation is the key. Go have a look at the MC source code, come back and tell me with a straight face that it is running as best as it can (putting Java aside).
Absolutely no modern virtual machine currently uses that method. The two simplest methods, implemented in the very earliest virtual machines (Sun's and Microsoft's), were two algorithms: mark-and-sweep, and stop-and-copy.
The Sun VM used mark-and-sweep, which is pretty much what you described: it goes through the list of objects, finds those that are no longer referenced, "marks" them, and then when it's done goes over the list again and deallocates all of them (it does it in two steps because deallocating the objects as it walks could cause issues with invoked finalizers). The MS Virtual Machine used stop-and-copy, which was faster but cost a bit more memory: instead of marking objects that weren't used, the MS VM copied all objects that were still live to a new piece of memory, then basically said "OK, this is the new memory block" and discarded all of the memory it had been using. That doesn't sound right, since it sounds like it should be slower; I probably didn't describe it properly. Oh well.
I said, in its simplest form. It is obvious modern VMs have improved since then.
Wrong. Only inactive pages are paged out to disk. Other applications might be page-faulted back into physical memory when you switch to them, but Minecraft's private working set will always "take priority" as long as it is "hitting" its memory frequently enough. As the foreground application, its entire working set will basically be kept "at the ready" in physical memory; I mean, it is the active application.
Fine on that.
(replace C++ by any other compiled language really, it's just for context)
Visual Basic 6 compiles to native code. It doesn't require you to perform allocations and deallocations manually. Delphi compiles to native code. It doesn't require it either.
and
In these languages, the difference is that you can actually use pointers. You need to manage those. And you NEED to use them in a game. Conclusion?
Um... I get that with my C# game. Not sure why it's magical, unless, despite what you just stated, manual memory management is hard to do right?
Compared to a Java game with the same code, the only thing I was thinking of is "wtf is Java doing". And that was quite a while ago. Been there, done that.
I believe your information on the garbage collector was based on current information in 1996.
Fair enough.
---
You make some good points, and I stand corrected on the GC. But the issue remains - Minecraft is poorly optimized, and while it isn't Java's fault (it really isn't), it's impossible to deny that it could, and should, run much faster on current computers.
Because it does not require even close to the amount of power required by even mid-range games such as Half-Life 2.
I'm fairly certain that Minecraft actually has more visible triangles than Half-Life 2 in your "average" scene. It sounds counterintuitive, but the blocky nature of Minecraft adds more triangles, not fewer: the models and built scenery in a game like Half-Life 2 are individually tweaked to get the best look from as few polys as possible, while Minecraft's dynamic world generation makes that sort of optimization far more difficult and only algorithmically feasible. A good analogy: on most machines, if you have a font file without a bold or italic styling, the computer will make it bold or italic algorithmically by applying a predefined set of rules, but a designed bold/italic font file for that family will almost always look better. There are in fact a few optimizations of this kind in Minecraft to my knowledge, such as larger squares of wall and land being represented using the minimum number of triangles (a 128x128 square of the same texture on the ground will only use two triangles, for example).
Say what you want, but it is inferior in every way in terms of computational power - fewer vertices, fewer textures, etc.
In addition to what I noted above, Minecraft doesn't have as many options open for how the vertices represent the on-screen polys. For example, in a game like Half-Life 2, certain features might use a triangle fan, where V vertices create V - 2 triangles (so 10 points give you 8 triangles, saving on the 24 vertices that a more conventional triangle list would require). Basically, the whole "randomly generated terrain" is what causes some of the issues, because most other games it is compared to performance-wise (such as, in your example, HL2) were painstakingly optimized resource-wise. My point here is that credit seems to be given to the language and compile method that games like HL2 use (C++) for the better performance, when it may very well be a factor of optimal resource construction (models, textures, terrain, etc).
what I'm really saying is that TINY VIEW DISTANCE SHOULD NOT SLOW DOWN MORE THAN FAR DISTANCE.
I didn't even know that was the case, seems that lower view distances aren't actually culling closer to the viewer. That's just... weird.
Go have a look at the MC source code
Minecraft is closed source; sure, it is possible to decompile and deobfuscate it, but it is foolish to judge the competency of any segment of code that is nothing more than a reconstructed ghost of what it really was, especially since a decompiled and deobfuscated version would be devoid of any comments, variable naming, etc. Although that's more my reason for not messing around with it; even after deobfuscation, the code itself is still obfuscated (if that makes any sense).
come back and tell me with a straight face that it is running as best as it can
Of course it isn't. Everything can be improved. That fog issue is just weird. Maybe it's actually some stupid thing with LWJGL.
I said, in its simplest form. It is obvious modern VMs have improved since then.
Well, yeah, but it sorta seemed like you were arguing against the ghost of what it was.
(replace C++ by any other compiled language really, it's just for context)
Visual Basic 6 compiles to native code. It doesn't require you to perform allocations and deallocations manually. Delphi compiles to native code. It doesn't require it either.
and
the difference is that you can actually use pointers.
Visual Basic 6 doesn't let you use pointers. In fact, the best equivalent is to use RtlMoveMemory and a few hidden VBA routines to copy data to and from pointers. The closest native thing in VB6 to a pointer is a reference (Delphi has pointers; they aren't required, but they are there). References are of course a type of pointer, and Java uses references for bloody well everything.
You need to handle those. And you NEED to use those in a game.
Wrong, you don't <need> to use pointers in a game. Also, Java uses pointers in the form of references. Thus NullPointerException. (Although I prefer the slightly more accurate .NET NullReferenceException)
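A tiny sketch of that point (names invented): dereferencing a null reference in Java is a catchable runtime exception, not a process-killing bad pointer as in C++.

```java
// Java references behave like pointers: dereferencing null throws
// NullPointerException at run time, which the program can catch.
public class NpeDemo {
    static String describe(String s) {
        try {
            return "length=" + s.length(); // dereferences the reference
        } catch (NullPointerException e) {
            return "null reference";
        }
    }

    public static void main(String[] args) {
        System.out.println(describe("abc")); // length=3
        System.out.println(describe(null));  // null reference
    }
}
```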
Compared to a Java game with the same code, the only thing I was thinking of is "wtf is Java doing". And that was quite a while ago. Been there, done that.
Its job. I admittedly didn't read the entire thing, but one of the links I posted in my previous post sort of explained that the design goals, use cases, and language design for Java took a totally different direction from C++. Now, this may facilitate arguments to the tune of "well, clearly that direction doesn't facilitate game design", but really, it does. In fact, in many ways, it's better: memory allocation is very fast as long as nothing needs to be paged out to make room, and Java's garbage collector intercepts various "system resources low" messages (on Windows, that is the WM_COMPACTING message) and will discard older generations of GC'd objects. The main problem isn't so much the GC as the design of the classes that use it; many classes are designed to be immutable, java.lang.String being a prime example, where string objects are constantly becoming dead. Just this segment here (which may not compile, haven't used Java in a while, YMMV, etc):
When the procedure finishes, there are now three new string objects that need to be deallocated: "hello", "world" and "helloworld". Thing is, rewriting the procedure as such:
will only add a string object and a StringBuilder; scant savings here, but what I'm demonstrating is that the construction and discarding of immutables in deep loops is the problem, not the garbage collector. With C and C++, because you are usually in control of allocation and deallocation, immutables are no problem: just free the object and set the value to null. The problem is that people try to apply a similar design pattern in Java and assume it will work just as well; essentially, the garbage collector implies a necessary rethinking of memory allocation.
But the issue remains - Minecraft is poorly optimized and while it isn't Java's fault (it really isn't), it's impossible to deny the fact that it could, and should, run much faster on current computers.
Certainly no issue with that. And, to be fair, I really do not like Java, but for a number of reasons otherwise irrelevant to what is being discussed here (specifically, it worries me that it took so long for them to get event handling working in a sensible manner, or that they took years to add lambdas/closures because they claimed "functional programming is a fad", which is ironic since the tagline against Java in the mid-nineties was that "object-oriented programming is a fad").
The basic point is, optimizing Minecraft is better achieved by the very methods outlined: better use of available functionality and (in the case of fog) not writing invisible triangles to the pipeline. Thing is, aside from that, most of the more common-sense optimizations have been made; a square wall of blocks of the same type is a single texture, for example, and there is backface culling, so if anything more recent versions of MC are faster than the old (although I cannot remember when backface culling was added). I don't know if it uses view frustum culling or any of the other methods of reducing triangles, but if not, they should be working on adding that; porting it to another language takes time for no gain and will only serve to add bugs.
Which brings me to another point: there is a common mindset, and it's a feeling programmers get as well, that at some point in development a program feels so messy and unmaintainable that it "is better to rewrite it". Thing is, that's never the case. The reason a program feels messy and unmaintainable is almost always because of the changes made during development to fix bugs and other errata; the single biggest mistake any software firm can make with their flagship product is to rewrite it from scratch, and that is essentially what would occur if they were to "port" it to C++. It's happened before: the reason Netscape failed was that they tried to rewrite it. Borland bought Arago and tried to turn it into dBase for Windows (as opposed to forking their existing dBase code), which utterly failed because Access ate them for lunch. Microsoft almost made the same mistake, trying to rewrite Word for Windows from scratch in a doomed project called Pyramid, which was shut down, thrown away, and swept under the rug.
Programmers are, in their hearts, architects, and the first thing they want to do when they get to a site is to bulldoze the place flat and build something grand. We're not excited by incremental renovation: tinkering, improving, planting flower beds. BORING.
There's a subtle reason that programmers always want to throw away the code and start over. The reason is that they think the old code is a mess. And here is the interesting observation: they are probably wrong. The reason that they think the old code is a mess is because of a cardinal, fundamental law of programming:
It’s harder to read code than to write it.
This is why code reuse is so hard. This is why everybody on your team has a different function they like to use for splitting strings into arrays of strings. They write their own function because it's easier and more fun than figuring out how the old function works.
As a corollary of this axiom, you can ask almost any programmer today about the code they are working on. "It's a big hairy mess," they will tell you. "I'd like nothing better than to throw it out and start over."
Why is it a mess?
"Well," they say, "look at this function. It is two pages long! None of this stuff belongs in there! I don't know what half of these API calls are for."
Before Borland's new spreadsheet for Windows shipped, Philippe Kahn, the colorful founder of Borland, was quoted a lot in the press bragging about how Quattro Pro would be much better than Microsoft Excel, because it was written from scratch. All new source code! As if source code rusted.
The idea that new code is better than old is patently absurd. Old code has been used. It has been tested. Lots of bugs have been found, and they've been fixed. There's nothing wrong with it. It doesn't acquire bugs just by sitting around on your hard drive. Au contraire, baby! Is software supposed to be like an old Dodge Dart, that rusts just sitting in the garage? Is software like a teddy bear that's kind of gross if it's not made out of all new material?
Back to that two page function. Yes, I know, it's just a simple function to display a window, but it has grown little hairs and stuff on it and nobody knows why. Well, I'll tell you why: those are bug fixes. One of them fixes that bug that Nancy had when she tried to install the thing on a computer that didn't have Internet Explorer. Another one fixes that bug that occurs in low memory conditions. Another one fixes that bug that occurred when the file is on a floppy disk and the user yanks out the disk in the middle. That LoadLibrary call is ugly but it makes the code work on old versions of Solaris.
Each of these bugs took weeks of real-world usage before they were found. The programmer might have spent a couple of days reproducing the bug in the lab and fixing it. If it's like a lot of bugs, the fix might be one line of code, or it might even be a couple of characters, but a lot of work and time went into those two characters.
When you throw away code and start from scratch, you are throwing away all that knowledge. All those collected bug fixes. Years of programming work.
You are throwing away your market leadership. You are giving a gift of two or three years to your competitors, and believe me, that is a long time in software years.
With Minecraft, what would happen if they rewrote it in C++ is this:
All updates to Minecraft would stop. After all, there is no point maintaining the old sloppy Java version; it's garbage now, the new C++ version will shine! So no new updates. "You want basic weather? We aren't going back to that old icky Java code, we're rewriting it in C++!" they'll say. Days turn to weeks turn to months turn to years.
Meanwhile, Manic Digger will likely have a revival as people miffed at being abandoned by Mojang rally behind it; new developers will enter the fold, and it will become rampant with development. With each passing day, more people will "lose hope" that there will ever be a C++ version of Minecraft, as Manic Digger (or some other project) gains more and more traction.
And what happens when the Minecraft C++ edition is released? The best case scenario for Mojang is that it works exactly like the Java version did, only slightly faster and with some new set of bugs. There won't have been any new content added; they were busy just duplicating what they already had. Meanwhile, Manic Digger (or again, whatever equivalent rises) will have passed what Minecraft was by leaps and bounds, quite possibly making Minecraft look to it as Infiniminer does to Minecraft now. This isn't a bad thing at all; we'll still have "a Minecraft" of sorts, and it will be open source and probably more easily modded. But I'm looking at this from Mojang's perspective.
BC_Programming, I had to register an account just to tell you how awesome that was. Rare is the forum post that is argued so coherently, clearly, and above all, civilly. I thank you for taking the time to write it.
God I love threads like this. A discussion between smart programmers always fills me with happy fuzzies cause as a n00b programmer, I learn a little bit.
I'm fairly certain that Minecraft actually has more visible triangles than Half-Life 2 in your "average" scene. It sounds counterintuitive, but the blocky nature of Minecraft adds more triangles, not fewer: the models and built scenery in a game like Half-Life 2 are individually tweaked to get the best look from as few polys as possible, while Minecraft's dynamic world generation makes that sort of optimization far more difficult and only algorithmically feasible. A good analogy: on most machines, if you have a font file without a bold or italic styling, the computer will make it bold or italic algorithmically by applying a predefined set of rules, but a designed bold/italic font file for that family will almost always look better. There are in fact a few optimizations of this kind in Minecraft to my knowledge, such as larger squares of wall and land being represented using the minimum number of triangles (a 128x128 square of the same texture on the ground will only use two triangles, for example).
Indeed. It really depends; if you're on top of a mountain on far distance, you'll have quite a few triangles to draw before hitting the far culling. And HL2 wasn't the best example either; the Source engine is quite efficient.
I didn't even know that was the case, seems that lower view distances aren't actually culling closer to the viewer. That's just... weird.
It is, indeed. For some reason, in SSP the fog system seems to cull appropriately, but in SMP as it is right now, anything below far view distance just kills the performance.
Minecraft is closed source; sure, it is possible to decompile it and deobfuscate it, but it is foolish to judge the competency of any segment of code that is nothing more than a reconstructed ghost of what it really was, especially since a decompiled and deobfuscated version would be devoid of any comments, variable naming, etc. Although those are more my reasons for not messing around with it; even after deobfuscation, the code itself is still obfuscated (if that makes any sense).
Indeed, although you can still kind of see which direction the code is going, even when it is obfuscated. Still, I don't like that closed-source philosophy either - mods have existed for as long as Minecraft has, so Mojang might as well release the code and make everyone's life easier (right?). Note that with another programming language, mods would still be doable if the code was released along with the game, but it would attract a totally different community of modders.
Of course it isn't. Everything can be improved. That fog issue is just weird. Maybe it's actually some stupid thing with LWJGL.
Maybe, although that would be strange coming from such a big library. It probably is just some bad code in the networking engine (with regard to the SMP issues above), maybe a typo?
Well, yeah, but it sorta seemed like you were arguing against the ghost of what it was.
Yeah I got a bit carried away with this particular model. My bad.
Wrong, you don't <need> to use pointers in a game. Also, Java uses pointers in the form of references. Thus NullPointerException. (Although I prefer the slightly more accurate .NET NullReferenceException)
Yes, but I meant actually using them to make the game run faster. You can argue that anything is a pointer, since basically a variable such as a longword (in Delphi for instance) is just a pointer with one level of indirection. A method within a class is a pointer with two, sometimes more (if the method is virtual) levels of indirection.
Its job; I admittedly didn't read the entire thing, but one of the links I posted in my previous post explained that the design goals, use cases, and language design for Java took a totally different direction from C++. Now, this may facilitate arguments along the lines of "well, clearly that direction doesn't facilitate game design", but really, it does. In fact, in many ways, it's better. Memory allocation is very fast as long as nothing needs to be paged out to make room, and Java's garbage collector intercepts various "system resources low" messages (on Windows, that is the WM_COMPACTING message) and will discard older generations of GC'd objects. The main problem isn't so much the GC as the design of the classes that use it; many classes are designed to be immutable, java.lang.String being a prime example, so string objects are constantly becoming dead; just take this segment here (which may not compile, haven't used Java in a while, YMMV, etc.):
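Something along these lines - a hypothetical reconstruction of the kind of snippet being described, not the author's exact code, in which every call creates three short-lived String objects:

```java
// Hypothetical reconstruction: naive concatenation that churns out three
// heap Strings ("hello", "world", "helloworld") per call, all of which
// become garbage for the collector as soon as the caller drops the result.
class StringChurn {
    static String greet() {
        String a = new String("hello");   // heap copy, dies after the call
        String b = new String("world");   // heap copy, dies after the call
        String c = a + b;                 // fresh "helloworld", dies too
        return c;
    }
}
```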
Well, it depends what you are aiming for. If you are just hacking up a game (like Minecraft in its current state), you're just going to allocate when you need to and let the GC do its stuff. But if I'm coding something that needs to run as fast as possible, you'd better bet I'll ensure every GetMem or VirtualAlloc or whatever is paired with one, and only one, FreeMem or VirtualFree. No one likes an out-of-memory exception when even the pagefile can't take it anymore.
But I agree - in many ways it simplifies the game development process and helps game developers focus on the game, not on some crappy arcane object that just won't free itself properly. But IMO it's just not a sustainable development model; code needs to be reviewed and cleaned up every once in a while. With Minecraft I get the nasty feeling the development team just kept piling more and more stuff onto the base code until it became basically impossible to maintain and optimize - at that point there are two paths of action:
- "perhaps we should rewrite this"
- "it's working anyway, the hell with it"
Which brings us to your next point..
Which brings me to another point: there is a common mindset - and it's a feeling programmers get as well - that at some point in development, a program feels so messy and unmaintainable that it "is better to rewrite it". Thing is, that's never the case. A program feels messy and unmaintainable almost always because of the changes made during development to fix bugs and other errata; the single biggest mistake any software firm can make with its flagship product is to rewrite it from scratch, and that is essentially what will occur if they "port" it to C++. It's happened before: the reason Netscape failed was that they tried to rewrite their browser. Borland bought Arago and tried to turn it into dBase for Windows (as opposed to forking their existing dBase code), which utterly failed because Access ate them for lunch. Microsoft almost made the same mistake, trying to rewrite Word for Windows from scratch in a doomed project called Pyramid, which was shut down, thrown away, and swept under the rug.
Rewriting from scratch is not an option once the project gets too big - and if they do rewrite it, they are going to focus on optimization, and as Donald Knuth put it well, "premature optimization is the root of all evil"; they will produce an even bigger mess. Also, for the reasons you outline above, a complete rewrite is not really the best idea, yet I think some portions of the code would benefit from a rewrite while others are best left alone.
The basic point is, optimizing Minecraft is better achieved by the very methods outlined: better use of available functionality and (in the case of fog) not sending invisible triangles down the pipeline. Thing is, aside from that, most of the common-sense optimizations have already been made; a square wall of blocks of the same type is rendered as a single texture, for example, and there is backface culling, so if anything more recent versions of MC are faster than the old (although I cannot remember when backface culling was added). I don't know if it uses view frustum culling or any of the other methods of reducing triangles, but if not, they should be working on adding that; porting it to another language takes time for no gain and will only serve to add bugs.
Actually, I think Minecraft handles the rendering part very well - at least on my systems. The real problem is not there; it's in the chunk update code, which takes too much processor time - and since Minecraft is fully single-threaded, rendering is severely affected once that happens. This is where parallelizing the chunk update process comes in handy - but it requires rewriting a fair bit of the code, which may not be an option. Yet it needs to be done eventually: CPUs are hitting thermal limits right now, while the Minecraft rendering distance may increase in the future, i.e. more (and bigger) chunk updates. We have a problem.
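A minimal sketch of what that parallelization might look like (assumed names, not Minecraft's actual code): chunk rebuilds are submitted to a worker pool, and the render thread just drains finished results once per frame.

```java
import java.util.concurrent.*;

// Sketch: expensive chunk rebuilds run on background workers; the render
// thread only picks up finished meshes, so it never stalls on a rebuild.
class ChunkUpdater {
    private final ExecutorService workers =
            Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors());
    private final BlockingQueue<int[]> finished = new LinkedBlockingQueue<>();

    // Pretend a "mesh" is just an int[]; real code would build vertex buffers.
    void requestRebuild(int chunkX, int chunkZ) {
        workers.submit(() -> {
            int[] mesh = {chunkX, chunkZ};   // the expensive rebuild would go here
            finished.add(mesh);
        });
    }

    // Called once per frame on the render thread; returns how many meshes
    // were picked up (real code would upload each to the GPU here).
    int drainAndUpload() {
        int uploaded = 0;
        while (finished.poll() != null) uploaded++;
        return uploaded;
    }

    void shutdown() {
        workers.shutdown();
        try {
            workers.awaitTermination(5, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
```

Keeping the final upload on the render thread matters because an OpenGL context is only current on one thread at a time; the workers do everything except touch the GPU.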
I think we both agree on a point: a complete rewrite is obviously not the way to go and would cause Minecraft to fail, as proven many times in the past. But it's not binary - it's not either a complete rewrite or nothing. It is also possible to take a look at the code, profile it, see where the biggest bottleneck is, and figure out how to get those fifty or so lines of code working faster (I'm thinking of the chunk update code here). Any programmer worth his salt knows that when a program is running slowly, it's not the whole code's fault - in any given program there's probably only 5% of the *total* code that is actually slowing down the whole of it. AFAIK the only complaints I have ever had and seen for Minecraft concern the chunk update and SMP networking code. That, apparently, is all that needs to be looked over - plus a couple of bugs, such as the interface becoming unresponsive when going from fullscreen to windowed mode. All the rest works well and should be left alone.
The point being that Minecraft should be playable on far distance on any frikkin netbook (yes, even written in Java).
Also, if the code does get ported eventually, it should open up a range of new improvements such as better graphics (not Crysis-like, but with improved physics and lighting), where Java might indeed not cut it.
And here is a little tutorial on garbage collection for dummies, to help people understand what memory leaks are. Basically, one of the differences between C++ and Java (replace C++ with any other compiled language, really; it's just for context) is that C++ requires the programmer to manage memory explicitly, allocating and freeing it properly, while Java simply lets the programmer allocate and essentially frees the memory "later".
This is done by a garbage collector, which in its simplest form is a loop that walks through all of the objects allocated by the Java code, decides whether they are actually still used, and frees them accordingly.
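In its very simplest form, that loop looks something like this toy version (a real collector is vastly more sophisticated; the hand-rolled object graph here is purely for illustration):

```java
import java.util.*;

// Toy mark-and-sweep over a hand-rolled object graph: mark everything
// reachable from the roots, then sweep away whatever wasn't marked.
class ToyHeap {
    static class Obj {
        final List<Obj> refs = new ArrayList<>();
        boolean marked;
    }

    final List<Obj> allObjects = new ArrayList<>();
    final List<Obj> roots = new ArrayList<>();

    Obj allocate() {
        Obj o = new Obj();
        allObjects.add(o);
        return o;
    }

    // Returns the number of objects "freed".
    int collect() {
        // mark phase: everything reachable from the roots
        Deque<Obj> stack = new ArrayDeque<>(roots);
        while (!stack.isEmpty()) {
            Obj o = stack.pop();
            if (o.marked) continue;
            o.marked = true;
            stack.addAll(o.refs);
        }
        // sweep phase: drop everything unmarked, reset marks for next time
        int before = allObjects.size();
        allObjects.removeIf(o -> !o.marked);
        for (Obj o : allObjects) o.marked = false;
        return before - allObjects.size();
    }
}
```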
The pro of this method is that the programmer doesn't need to worry about memory management and memory leaks - Java does it!
The con is that it doesn't do it well at all. The garbage collector is only called every X seconds (or when system memory is running low), meaning that within this interval nothing gets freed, hence why memory use in Minecraft grows so quickly. Another con is that when it is called, it slows the game down by quite a bit (although this is mostly negligible).
The biggest problem with the Java method is that if the garbage collector isn't called soon enough, the game might actually run out of memory, which, due to how Windows works, causes a sizeable amount of its memory to be dumped to the hard drive. Your hard drive is immensely slower than your RAM, though, so when this happens you're pretty much doomed: the game slows to a crawl until the garbage collector cleans things up (and then it starts all over again). And if you're wondering - no, the garbage collector doesn't kick in fast enough when the system runs out of memory to free some space before the system starts dumping to the hard drive.
This is especially a problem on 32-bit Java, where Java programs can't allocate more than 1 gigabyte of memory (shouldn't it be 2? I can't allocate more than one on my laptop and I have 4 gigs, not sure here).
Personally I prefer the C++ way of doing it, because it is basically better. And managing memory isn't that hard, really. There's this really magical moment when you fire up your game, look at the task manager, and see memory usage going up and down smoothly as you play, with no memory leaks. Really satisfying.
So overall, if you have enough memory to keep a "safety margin" while running Minecraft, you wouldn't notice a significant performance gain from a direct port to another language without changing the code. If you don't, well, you would notice a big difference. A quick and dirty workaround for those of you with this problem is to reduce the interval at which the garbage collector is called (google it).
I think people may also be running into problems with plugin writers. Some of these guys are learning Java just to mod the game and don't understand that creating objects can be expensive, so they either never release them or keep them around longer than necessary. Next thing you know, they're complaining about the game's performance when in fact the plugin was the bottleneck.
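For instance (hypothetical mod code, just to illustrate the pattern): allocating scratch objects on every tick versus reusing one buffer.

```java
// Hypothetical mod code: wastefulTick creates fresh garbage on every
// single tick for the GC to chew on; frugalTick reuses one scratch
// buffer and allocates nothing in the hot path.
class TickExample {
    private final double[] scratch = new double[3];  // reused each tick

    double wastefulTick(double x, double y, double z) {
        double[] v = {x, y, z};          // new array every tick: garbage
        return v[0] + v[1] + v[2];
    }

    double frugalTick(double x, double y, double z) {
        scratch[0] = x; scratch[1] = y; scratch[2] = z;  // no allocation
        return scratch[0] + scratch[1] + scratch[2];
    }
}
```

Both return the same result; the difference only shows up as GC pressure when the method runs twenty times per second for hours.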
In any case, hearing Notch talk about the iPhone, I have no doubt they are actively porting the game to a compiled language so they can get it into the App Store. Whether this translates into a native desktop client is hard to say, since Notch sounds pretty comfortable in Java and can prototype ideas relatively quickly in it.
I wish Java would allow more manual memory management beyond "call System.gc() and pray". If there's anything beyond that, I haven't heard of it.
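For the record, this is roughly the entire "manual" toolbox the standard library offers: a collection hint, some heap statistics, and weak references that merely give the collector permission.

```java
import java.lang.ref.WeakReference;

// About all the "manual" memory control standard Java gives you.
class GcKnobs {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        System.out.println("max heap:  " + rt.maxMemory());
        System.out.println("allocated: " + rt.totalMemory());
        System.out.println("free:      " + rt.freeMemory());

        // A weak reference lets the GC reclaim the object whenever it likes.
        WeakReference<byte[]> cache = new WeakReference<>(new byte[1024]);

        System.gc();   // merely a suggestion; the VM is free to ignore it

        // After a collection the weak referent may or may not be gone.
        System.out.println("cache still alive: " + (cache.get() != null));
    }
}
```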
"Any sufficiently advanced technology is indistinguishable from magic"
-Arthur C. Clarke
"Any sufficiently rigorously defined magic is indistinguishable from technology"
-Larry Niven
For example, regarding C++ and memory management:
Why?
Why?
Absolutely no modern virtual machine uses that method. The earliest virtual machines (Sun's and Microsoft's) implemented the two simplest algorithms: Mark and Sweep, and Stop and Copy.
The Sun VM used Mark and Sweep, which is pretty much what you described: it goes through the list of objects, finds those that are no longer referenced, "marks" them, and then, once it's done, goes over the list again and deallocates all of them (it does this in two steps because deallocating objects as it finds them could cause issues with invoked finalizers). The MS virtual machine used "Stop and Copy", which was faster but cost a bit more memory: instead of marking objects that weren't used, the MS VM copied every object that was still live to a new piece of memory, then basically said "OK, this is the new memory block" and discarded all of the memory it had been using. That sounds like it should be slower, but the trick is that the copying pass only ever touches live objects, and afterwards allocation is a simple pointer bump into the fresh block.
In any case, all modern Java VMs use a garbage collector implemented with a generational approach (all I know about generational GCs is from .NET, which probably isn't quite the same); more Java-relevant detail here: http://java.sun.com/docs/hotspot/gc1.4.2/faq.html
another nice article: http://www.javacoffeebreak.com/articles ... ction.html
Wrong. Only inactive pages are paged out to disk. Other applications might be page-faulted back into physical memory when you switch to them, but Minecraft's private working set will always "take priority" as long as it is "hitting" its memory frequently enough. As the foreground application, its entire working set will basically be kept "at the ready" in physical memory - I mean, it is the active application.
Visual Basic 6 compiles to native code. It doesn't require you to perform allocations and deallocations manually. Delphi compiles to native code. It doesn't require it either.
and
Java is a compiled language. So I guess their work is done then. You both obviously mean a language with a compiler that targets machine code; well, there are native-code compilers for Java, but that defeats the entire purpose of the language, which is to be portable. If you want to tie yourself down to a certain architecture, or maintain some convoluted setup for compiling to different architectures, that is your prerogative, but the half-baked conclusion that compiling to native code is better is simply wrong. For one thing, it limits your options: if you compile a program with certain optimizations, that's it - you'll have to recompile to change them. With Java, the compiled bytecode is compiled to the machine code of the native environment, with optimizations based on the configuration, such as the availability of processor extensions, x64, etc.
Um... I get that with my C# game. Not sure why it's magical, unless, despite what you just stated, manual memory management is hard to do right?
I believe your information on the garbage collector was based on current information in 1996.
You didn't read the "Java is fine" option did you?
Ignorance is quite prevalent in these parts.
Yes, I meant machine code as opposed to bytecode. But if they are going to port this to iOS devices (which Notch has expressed interest in doing), they will HAVE to port it to another language, as Java is not allowed to run on iOS devices per Apple's TOS, as I'm sure you know. Hence my assertion that they are actively porting it. This isn't rocket science.
Because it does not require even close to the amount of power required by even mid-range games such as Half-Life 2. Say what you want, but it is inferior in every way in terms of computational demands - fewer vertices, fewer textures, etc. And even if I turn out to be wrong and you do need a somewhat decent processor for MC, what I'm really saying is that TINY VIEW DISTANCE SHOULD NOT SLOW DOWN MORE THAN FAR DISTANCE. Optimisation is the key. Go have a look at the MC source code, then come back and tell me with a straight face that it is running as fast as it can (putting Java aside).
I said, in its simplest form. It is obvious modern VM's have improved since then.
Fine on that.
In these languages, the difference is that you can actually use pointers. You need to handle those. And you NEED to use those in a game. Conclusion?
Compared to a Java game with the same code, the only thing I kept thinking was "wtf is Java doing". And that was quite a while ago. Been there, done that.
Fair enough.
---
You make some good points and corrected my post, I stand corrected on the GC. But the issue remains - Minecraft is poorly optimized and while it isn't Java's fault (it really isn't), it's impossible to deny the fact that it could, and should, run much faster on current computers.
In addition to what I noted above, Minecraft doesn't have as many options open for how vertices represent the on-screen polys. For example, in a game like Half-Life 2, certain features might use a triangle fan: a single center vertex followed by N rim vertices, giving up to N triangles when the fan closes on itself (so 9 points can give you 8 triangles, saving on the 24 vertices a more conventional triangle list would require). Basically, the whole "randomly generated terrain" thing is what causes some of the issues, because most other games it is compared to performance-wise (such as, in your example, HL2) were indescribably optimized resource-wise. My point here is that credit seems to be given to the language and compile method that games like HL2 use (C++) for the better performance, when it may very well be a factor of optimal resource construction (models, textures, terrain, etc.).
Visual Basic 6 doesn't let you use pointers. In fact, the best equivalent is to use RtlMoveMemory and a few hidden VBA routines to copy data to and from raw addresses. The closest native thing in VB6 to a pointer is references (Delphi has pointers; they aren't required, but they are there). References are of course a type of pointer, but Java uses references for bloody well everything.
Its job; I admittedly didn't read the entire thing, but one of the links I posted in my previous post explained that the design goals, use cases, and language design for Java took a totally different direction from C++. Now, this may facilitate arguments along the lines of "well, clearly that direction doesn't facilitate game design", but really, it does. In fact, in many ways, it's better. Memory allocation is very fast as long as nothing needs to be paged out to make room, and Java's garbage collector intercepts various "system resources low" messages (on Windows, that is the WM_COMPACTING message) and will discard older generations of GC'd objects. The main problem isn't so much the GC as the design of the classes that use it; many classes are designed to be immutable, java.lang.String being a prime example, so string objects are constantly becoming dead; just take this segment here (which may not compile, haven't used Java in a while, YMMV, etc.):
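Something along these lines - a hypothetical reconstruction of the kind of snippet being described, not the author's exact code, in which every call creates three short-lived String objects:

```java
// Hypothetical reconstruction: naive concatenation that churns out three
// heap Strings ("hello", "world", "helloworld") per call, all of which
// become garbage for the collector as soon as the caller drops the result.
class StringChurn {
    static String greet() {
        String a = new String("hello");   // heap copy, dies after the call
        String b = new String("world");   // heap copy, dies after the call
        String c = a + b;                 // fresh "helloworld", dies too
        return c;
    }
}
```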
When the procedure finishes, there are now three new string objects that need to be deallocated: "hello", "world", and "helloworld". Thing is, rewriting the procedure as such:
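Presumably something like this (again a hypothetical reconstruction, not the author's exact code):

```java
// Hypothetical reconstruction of the rewritten snippet: the builder is
// mutable, so the only allocations are one StringBuilder and the final
// result String.
class StringChurnLess {
    static String greet() {
        StringBuilder sb = new StringBuilder();  // one builder...
        sb.append("hello");
        sb.append("world");
        return sb.toString();                    // ...plus one result String
    }
}
```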
will only allocate a StringBuilder and a single String; scant savings here, but what I'm demonstrating is that the construction and discarding of immutables in deep loops is the problem, not the garbage collector. With C and C++, because you are usually in control of allocation and deallocation, immutables are no problem: just free the object and set the pointer to null. The problem is that people try the same design pattern in Java and assume it will work as well; essentially, the garbage collector demands a rethinking of how you allocate memory.
Certainly no issue with that. And, to be fair, I really do not like Java, but for a number of reasons otherwise irrelevant to what is being discussed here (specifically, it worries me that it took so long for them to get event handling working in a sensible manner, or that they took years to add lambdas/closures because they claimed "functional programming is a fad", which is ironic since the tagline against Java in the mid-nineties was that "object-oriented programming is a fad").
With Minecraft, what would happen if they rewrote it in C++ is this:
All updates to Minecraft would stop. After all, there is no point maintaining the old sloppy Java version - it's garbage now, the new C++ version will shine! So no new updates. You want basic weather? "We aren't going back to that old icky Java code, we're rewriting it in C++!" they'll say. Days turn to weeks turn to months turn to years.
Meanwhile, Manic Digger will likely have a revival as people miffed at being abandoned by Mojang rally behind it; new developers will join the fold, and development will run rampant. With each passing day, more people will lose hope that there will ever be a C++ version of Minecraft, as Manic Digger (or some other project) gains more and more traction.
As appears to be the custom, have a diamond.