Can someone help me? Every time I use the shader pack, this happens.
ATTACHMENTS
2014-09-01_16.42.18
1. Do you have MAC_LAG_FIX enabled? Read the first post for instructions.
2. 8 GB is a lot, even if you are using a big modpack. Just throwing RAM at Minecraft/Forge isn't going to help much. Try changing the JVM arguments in the launcher profile to the following:
I found these on Reddit somewhere (might have been the FTB subreddit), so I don't know exactly what they all do. All I know is they work, and Forge is way more stable with them.
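The actual arguments didn't survive in this copy of the thread, so as an illustration only, here is the general shape of the G1GC-style flags that were commonly passed around for Forge at the time. The specific values below are a stand-in of mine, not the original Reddit set:

```
-Xmx4G -Xms4G -XX:+UseG1GC -XX:MaxGCPauseMillis=50 -XX:G1HeapRegionSize=32M
```

These go in the "JVM Arguments" field of the launcher profile, replacing the default -Xmx line.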
I'm getting extreme lag even though my computer should be able to handle it. The Mac workaround fix adds 5-10 FPS, but it's still completely unplayable.
Edit: I'm using Vibrant 1.051 Ultra.
Specs:
Processor: 3.4 GHz Intel core i5
Memory: 16 GB 1600 MHz DDR3 (8 GB allocated to Minecraft)
Graphics: NVIDIA GeForce GTX 780M 4096 MB
OS X 10.9.4
Log:
Java HotSpot(TM) 64-Bit Server VM warning: Using incremental CMS is deprecated and will likely be removed in a future release
[20:58:31] [main/INFO]: Loading tweak class name optifine.OptiFineTweaker
[20:58:31] [main/INFO]: Using primary tweak class name optifine.OptiFineTweaker
[20:58:31] [main/INFO]: Loading tweak class name shadersmodcore.loading.SMCTweaker
[20:58:31] [main/INFO]: Calling tweak class optifine.OptiFineTweaker
OptiFineTweaker: acceptOptions
OptiFineTweaker: injectIntoClassLoader skipped, OptiFine is loaded as a library
[20:58:31] [main/INFO]: Calling tweak class shadersmodcore.loading.SMCTweaker
OptiFineTweaker: getLaunchArguments
OptiFineTweaker: getLaunchTarget
[20:58:31] [main/INFO]: Launching wrapped minecraft {net.minecraft.client.main.Main}
[SMC FNE]transforming bao bao
[SMC FNE] 50925 (+59)
[20:58:31] [main/INFO]: Setting user: charlesjr3
[20:58:31] [main/INFO]: (Session ID is token:602393121589478484fd35d85e66cbd1:e1bd42441e0b4ff5a231ec72aff3f928)
[SMC FNE]transforming aji aji
[SMC INF] blockAoLight
[SMC FNE] 28869 (+61)
[SMC FNE]transforming abh abh
[SMC FNE] 2482 (+0)
Completely ignored arguments: []
[OptiFine] (Reflector) Method not present: aji.func_149713_g
[20:58:32] [Client thread/INFO]: LWJGL Version: 2.9.1
[SMC FNE]transforming buu buu
[SMC FNT] set activeTexUnit
[SMC FNE] 14412 (+65)
[OptiFine]
[OptiFine] OptiFine_1.7.10_HD_U_A4
[OptiFine] Mon Aug 25 20:58:32 MST 2014
[OptiFine] OS: Mac OS X (x86_64) version 10.9.4
[OptiFine] Java: 1.8.0_05, Oracle Corporation
[OptiFine] VM: Java HotSpot(TM) 64-Bit Server VM (mixed mode), Oracle Corporation
[OptiFine] LWJGL: 2.9.1
[OptiFine] OpenGL: NVIDIA GeForce GTX 780M OpenGL Engine version 2.1 NVIDIA-8.26.26 310.40.45f01, NVIDIA Corporation
[OptiFine] OpenGL Version: 2.1
[OptiFine] Maximum texture size: 16384x16384
[OptiFine] Checking for new version
[SMC FNE]transforming bpq bpq
[SMC FNE] 912 (+136)
[SMC FNE]transforming bpp bpp
[SMC FNE] 714 (+276)
[SMC FNE]transforming bqh bqh
[SMC FNE] 180 (+63)
[SMC FNE]transforming bpz bpz
[SMC FNT] loadRes
[SMC FNT] loadRes
[SMC FNT] allocateTextureMap
[SMC FNT] setSprite setIconName
[SMC FNT] uploadTexSubForLoadAtlas
[SMC FNE] 18066 (+458)
[SMC FNE]transforming bno bno
[SMC FNR] conditionally skip default shadow
[SMC FNE] 6686 (+78)
[SMC FNE]transforming blm blm
[SMC FNE] 151143 (+405)
[20:58:32] [Client thread/INFO]: Reloading ResourceManager: Default
[SMC FNE]transforming bqf bqf
[SMC FNE] 5679 (+104)
[OptiFine] *** Reloading textures ***
[OptiFine] Resource packs: Default
[SMC FNE]transforming bmh bmh
[SMC FNE] 8557 (-974)
I am curious if this is a result of OS X limiting you to OpenGL 2.1 when your card is capable of 4+.
The JAR is the same for OS X and Windows, but on OS X the way LWJGL and its native libraries are loaded limits you to OpenGL 2.1 (on Windows it goes up to 3.2+ with backward compatibility to 2.1).
Mind you, my machine (i7-2600K, 16GB DDR3-1866, 780 Ti 3GB) will run at ~30 FPS at 1080p / far view distance with the Ultra 1.051 shaders.
When you say lag, do you mean FPS drops, or missing chunks, or can you explain further?
I use OptiFine (as do you), but I enable fast math / fast render / smooth world. I do not have smooth FPS or dynamic/preloaded chunks at all.
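You can confirm the 2.1 cap straight from the log: the GL_VERSION string reported there is "2.1 NVIDIA-8.26.26 310.40.45f01". A quick way to pull the major.minor pair out of such a string (the helper name gl_major_minor is mine, not from any mod):

```python
import re

def gl_major_minor(version_string):
    """Extract the (major, minor) OpenGL version from a GL_VERSION string."""
    m = re.match(r"(\d+)\.(\d+)", version_string)
    return (int(m.group(1)), int(m.group(2))) if m else None

# The version string from the log above:
print(gl_major_minor("2.1 NVIDIA-8.26.26 310.40.45f01"))  # → (2, 1)
```

So the driver really is exposing only a 2.1 context to the game, regardless of what the card can do on other platforms.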
I tried your settings and still get ~10 FPS at the default resolution.
What is the default resolution on your Mac?
If you have a retina screen, your resolution is 2560x1600 or 2880x1800 (depending on if you are running the 13" or 15" screen), instead of the typical 1920x1080.
That would explain the massive FPS drops a bit more...
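The arithmetic behind that point: a 15" Retina panel pushes about 2.5x the pixels of 1080p, so a fill-rate-bound shader pack can be expected to lose FPS in roughly that proportion. A quick back-of-the-envelope check:

```python
# Compare the pixel counts of a 15" Retina panel and standard 1080p.
retina = 2880 * 1800      # 15" Retina MacBook Pro native resolution
full_hd = 1920 * 1080     # typical 1080p monitor

print(retina, full_hd, round(retina / full_hd, 2))  # → 5184000 2073600 2.5
```

Running at a lower render resolution (or windowed at 1920x1080) is a cheap way to test whether the pixel count is the bottleneck.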
I support:
Sildur's Vibrant Shaders v1.03 Intel http://pastebin.com/UQZ0fX0p
Sildur's Basic Shaders v1.04 Regular http://pastebin.com/nDa7U7Fc
Updating my driver to 10.18.10.3621
EDIT: Updating the drivers didn't change anything.
I don't know what to tell you. I have Intel HD 4000. Have you tried updating your graphics driver?
Try the Intel version. I had this exact same error when using SEUS.
I don't know, maybe. I'm using Ultra too, and I don't see this problem.
I don't know the difference between High and Ultra, but yes, I do use OptiFine.
Does anybody know how to make the bloom on torches brighter?
Bloom, because in the screenshot on the first page of this thread I saw that the torches on the tree have better bloom than mine.
This is a very typical Intel error that happens with most shader packs.
The problem is any driver version ending in 34XX or 36XX; in this case, updating is BAD!
Sildur's 1.05 does not work on Intel yet, the Basic does, and 1.03 Intel is hit-and-miss (usually works, may be slow depending on CPU and RAM).
You'll want to download the appropriate 32-bit or 64-bit driver here:
http://www.guru3d.com/files-details/intel-hd-graphics-driver-download.html
Then go into the Control Panel -> Programs and Features, find Intel HD Graphics, click on it, and uninstall. Reboot if needed. Then install the downloaded drivers, and DO NOT UPDATE.
From there, try 1.03 Intel or Basic, or Mr. Meeps, or DeDelner's CUDA Shaders.
Do you happen to use Player API / Smart Moving?
Can you post a log? Then we can see if any errors are popping up.
Load up MC, select a shaderpack, load a world, save and quit, and then exit MC.
Then go to %appdata%\.minecraft\logs and post fml-client-latest.log to paste.ee and link it here.
Tried all of those, it's still unplayably laggy.
Mine doesn't lag at all, and I can even play Hunger Games with the shaders on, and I'm using a Mac too.
I'm on a Mac, btw.
What version of the shader pack?