Hey guys. Recently I had a problem with my operating system where it wouldn't open any files at all, so I had to take it in for repair and the technician had to uninstall and reinstall Windows 10. I'm not sure what caused the problem, but I was wondering: can OptiFine cause this kind of problem on an Nvidia card?
My friend told me Nvidia cards aren't meant for optimization, unlike other cards like AMD. I have a GTX 1050 Ti. Can OptiFine really cause such a problem, and should I install it back onto Minecraft?
By the way, I've had OptiFine installed ever since I bought Minecraft, around 10-12 months ago, but the problem with my PC only started three weeks to a month ago.
Assuming you downloaded OptiFine from the official page, https://optifine.net/downloads (use the (mirror) link to avoid adware), then NO, OptiFine can't harm your computer.
I can't really find anything one way or the other regarding problems with Optifine and Nvidia GPUs.
You MIGHT need to download the latest version of 64-bit Java and force Minecraft to use that. (I do that as a matter of course.)
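A quick way to tell whether a given Java runtime is 64-bit is to run `java -version` and look for the "64-Bit" marker in the VM line. A small sketch of that check (the sample output strings below are illustrative, not copied from a real install):

```python
def is_64bit_java(version_output: str) -> bool:
    """Return True if `java -version` output indicates a 64-bit VM.
    HotSpot prints "64-Bit" in the VM line on 64-bit builds."""
    return "64-Bit" in version_output

# Illustrative `java -version` outputs (build strings are made up):
out_64 = ('java version "1.8.0_202"\n'
          'Java HotSpot(TM) 64-Bit Server VM (build 25.202-b08, mixed mode)')
out_32 = ('java version "1.8.0_202"\n'
          'Java HotSpot(TM) Client VM (build 25.202-b08, mixed mode)')

print(is_64bit_java(out_64))  # True
print(is_64bit_java(out_32))  # False
```

Once you have a 64-bit Java installed, you can point the launcher's "Java executable" profile setting at it.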
Try Minecraft with and without Optifine and see what happens.
Please report back here if you discover anything interesting.
There are no dangerous weapons. There are only dangerous people. R.A. Heinlein
If you aren't part of the solution, then you obviously weren't properly dissolved.
The latest release of Amidst, version 4.6, can be found here:
https://github.com/toolbox4minecraft/amidst/releases
You should probably also read this:
https://www.minecraftforum.net/forums/mapping-and-modding-java-edition/minecraft-tools/2970854-amidst-map-explorer-for-minecraft-1-14
You can find me on the Minecraft Forums Discord server.
https://discord.gg/wGrQNKX
I've been using OptiFine for many, many years on my primary PC (GeForce GTX 560 Ti, GTS 650, and GTX 1060 6GB), my secondary PC (GeForce GT 430), a friend's PC (GeForce GT 610, GT 240 [?], GT 740 [?]), my laptop (Intel onboard 4400 series), among others. As you can see, I'm unsure of or forget some exact models, but OptiFine has run across nearly half a dozen PCs and a number of users. I've yet to have it cause my operating system to be unable to open a file (nor has anyone else mentioned it causing that). If anything, that sounds like a serious malware-caused issue.
There's actually a setting that does (or at least did) only work on nVidia hardware (it may work on AMD now too?), and that is the fancy fog option. So it'd be surprising if OptiFine "wasn't meant for nVidia". When I play on my laptop (it stays nearly consistent at 60 FPS, some momentary hitches aside), the thing that actually bothers me the most isn't the smaller resolution, or the lower render distance, or even the lack of anti-aliasing; it's the way the fog behaves toward the edge of the screen.
This is due to the vanilla game and/or LWJGL using an Nvidia-specific OpenGL extension. There is no reason why other GPUs shouldn't be able to render spherical fog, nor do I see any performance difference between no fog at all and "fancy" fog (which is the default in vanilla whenever it is supported, regardless of the "fast/fancy" graphics setting), so I doubt it has anything to do with performance:
MC-32452 Fog generates only in the centre of the screen
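For context on why this ended up Nvidia-specific: default OpenGL fog is computed from the eye-space depth (planar), while spherical "fancy" fog uses the true radial distance from the eye, which in the fixed-function era was exposed by the GL_NV_fog_distance extension (GL_EYE_RADIAL_NV mode). A small sketch of the difference between the two distance metrics (the coordinates are arbitrary eye-space values, purely for illustration):

```python
import math

def planar_fog_distance(x: float, y: float, z: float) -> float:
    # Default OpenGL fog: "distance" is just the eye-space depth, so
    # fragments near the screen edge get less fog than they should
    # (fog appears concentrated in the centre of the screen).
    return abs(z)

def radial_fog_distance(x: float, y: float, z: float) -> float:
    # Spherical fog (GL_NV_fog_distance's GL_EYE_RADIAL_NV mode):
    # true Euclidean distance from the eye, uniform in every direction.
    return math.sqrt(x * x + y * y + z * z)

# A fragment straight ahead vs. one at the same depth near the screen edge:
ahead = (0.0, 0.0, 100.0)
edge = (60.0, 0.0, 100.0)
print(planar_fog_distance(*ahead), planar_fog_distance(*edge))  # 100.0 100.0
print(radial_fog_distance(*ahead))                              # 100.0
print(radial_fog_distance(*edge))                               # ~116.6
```

With planar fog the two fragments get identical fog despite the edge fragment being considerably farther away, which matches the "fog only in the centre of the screen" symptom in the bug report.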
Otherwise, at least in older versions, there may have been some truth to OptiFine not helping as much on Nvidia GPUs. Prior to 1.8 the game had an "Advanced OpenGL" option which enabled hardware occlusion culling, and this made a huge difference on my old computer, doubling FPS, much more than OptiFine itself does. As it happens, OptiFine added this feature itself before it was made part of vanilla, and I've always thought this is where its claim of "doubled FPS" came from (in my experience the greatest benefit of OptiFine is improved chunk rendering performance, as in chunk updates). However, Advanced OpenGL generally didn't work that well on Intel/AMD, so I can see those GPUs benefiting more from OptiFine (in 1.8 Mojang replaced it with their own culling method due to the issues on non-Nvidia GPUs, where it often actually worsened performance).
TheMasterCaver's First World - possibly the most caved-out world in Minecraft history - includes world download.
TheMasterCaver's World - my own version of Minecraft largely based on my views of how the game should have evolved since 1.6.4.
Why do I still play in 1.6.4?
Interesting.
Turning on the Advanced OpenGL setting in the older versions made a massive difference for me as well (nVidia GeForce GTX 560 Ti at the time). When that option later went away (starting with one of the snapshots for 1.8?), performance was better than with the option off, but worse than with it on.
It also appears to have more visual anomalies, and when they occur, they are completely consistent. There are spots in my world where, if I view them from a certain other spot and angle, nothing is rendered at all (and you see the sky/void in that spot). As mentioned, it's consistent: if it happens to a spot when viewed from a given location, it will always happen (unless something in that portion of the world is changed to make it not happen). It's fairly rare, and usually only a small portion is affected, but it's still a rather serious bug, especially in the cases where it's not minor. It started with 1.8 for me and persists through 1.10 (not sure if 1.13 or 1.14 will have fixed it).
I see this effect also when playing 1.14.4.
This is a side-effect of occlusion culling, and it also happens in older versions when Advanced OpenGL is enabled. Basically, the game "cheats" by treating chunks as single 16x16x16-block cubes instead of checking every single block for visibility, which greatly speeds up the culling checks. (I'm not sure about 1.8's implementation, but AOGL first renders chunks as 16x16x16 cubes, not drawn to the screen, and checks whether any parts are visible with the "GL_ARB_occlusion_query" OpenGL extension, then renders the actual chunks based on the visibility results.)
See:
MC-129 Chunks not loading surface, revealing caves, etc. (for versions before 1.8)
MC-63020 Chunks are not rendered in first person in some cases (for 1.8 and later, still unresolved)
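To illustrate the chunk-granularity "cheat" described above, here is a toy one-dimensional model (not the actual renderer's algorithm): chunks along the view ray are treated as whole opaque-or-not cubes, so a single coarse answer hides an entire 16-block region at once:

```python
# Toy 1-D model of chunk-level occlusion culling: the renderer treats each
# 16x16x16 chunk as a single opaque-or-not cube instead of testing every
# block. One wrong coarse answer therefore culls a whole 16-block region,
# which is why the anomalies show up as chunk-sized sky/void holes.

def visible_chunks(chunks_opaque):
    """Chunks in a straight line from the camera (index 0 is nearest).
    Chunks are drawn until the first 'fully opaque' cube blocks the rest."""
    drawn = []
    for i, opaque in enumerate(chunks_opaque):
        drawn.append(i)
        if opaque:   # coarse test: treat the whole 16^3 cube as solid
            break    # everything behind it is culled wholesale
    return drawn

# Chunk 2 tests as "solid", so chunks 3 and 4 are never rendered -- even
# though a 1-block tunnel through chunk 2 could make them actually visible.
print(visible_chunks([False, False, True, False, False]))  # [0, 1, 2]
print(visible_chunks([False, False, False]))               # [0, 1, 2]
```

The real implementations (GL_ARB_occlusion_query in AOGL, Mojang's software culling in 1.8+) are of course 3-D and query-based, but the failure mode is the same: a per-chunk visibility answer applied to every block inside the chunk.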
I don't know about newer versions, but I never experience rendering issues with OptiFine (I recall that vanilla's rendering was quite bad in 1.5.x, especially while boating around in a survival island world I was playing on, and this was one reason why I first installed OptiFine). I've also noticed that the occlusion culling in modern versions sometimes malfunctions on a large scale, causing large areas to flash between invisible and visible even when they are clearly visible. I looked for a bug report for this, but they all seem to be marked as duplicates of MC-62958, which is definitely not the correct issue, while MC-149178 describes rendering issues in 1.14 (not 1.14.1+), though I recall seeing this as early as 1.8.
It never happened for me in versions prior to 1.8 though, and I played with Advanced OpenGL on. The anomalies I'm seeing only started with 1.8 and after.
That first issue you linked (for versions prior to 1.8) seems to be a broader and/or different issue. They may both be occlusion culling issues, but it claims "relogging fixes the issue", whereas the issue I am referring to is absolutely consistent: if the visual anomaly occurs, it will ALWAYS occur unless the terrain between the viewing location and the affected location is changed in a way that makes it not happen. Relogging will not change this.
As for OptiFine, the only issue I ever had with it was the old multi-core rendering problem in much older versions, which required a certain nVidia driver setting to correct. No idea if that's even still the case.