You do know that more FPS doesn't mean no lag, right?
I mean that it can still lag; it can fluctuate from 290 to 310, and that could cause lag.
If you had 300 FPS, you WILL lag, because your monitor's refresh rate can't keep up.
Turn on V-sync and you will get at most 60-120 FPS (depending on the monitor).
Skipping frames and having a slow game. Either that or internet problems.
I'm usually quiet about this, but this is one of my "peeves", as it were, because "lag" refers to latency, and latency is only a significant delay when a network is involved. There is something annoying about people who conflate FPS issues with "lag", because the latter still exists, and now we no longer have a word for it: apparently "lag" can refer either to poor system performance on the client side or to network latency. A lot of times people will come in with some silly "math" about how low FPS is "technically" a form of lag. That is rather like saying that, technically, an airplane is an automobile. Even if it were technically correct, the term would be providing less information: by using it for more than one thing, it has lost precision.
Speaking from experience with Quake I multiplayer: low FPS was common in certain areas of certain levels because of their more complex geometry, and one player's low FPS didn't affect other players, except those with similar hardware. With good enough hardware, the system was able to keep up.
"Lag" differed from this, and at the time was never confused with low FPS, because the game's performance on the client remained unchanged; only the server updates were latent.
So, for example: at low FPS you might end up playing at 10 FPS. This can affect gameplay and your ability to play, but it wasn't lag; it was low FPS.
In contrast, latency would not affect your FPS. You could be playing in the abandoned base in a firefight: you are chasing somebody down a hallway with a grenade launcher, and at a T intersection they go straight. You follow, then they disappear, and suddenly they are behind you.
That was lag, because from their perspective they turned at the intersection. Because you were not receiving server game states fast enough, your client predicted that they would keep moving in the direction they were going (straight); then it received a game update from the server with the player's actual position, and they were behind you.
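The client-side prediction described above (often called dead reckoning) can be sketched roughly like this. This is a toy illustration with made-up numbers, not code from Quake or any real engine:

```python
# Hypothetical sketch of client-side prediction ("dead reckoning"):
# the client extrapolates a player's position from the last known
# state, then snaps to the authoritative server update when it arrives.

def predict(last_pos, velocity, dt):
    """Extrapolate position assuming the player keeps moving straight."""
    return (last_pos[0] + velocity[0] * dt,
            last_pos[1] + velocity[1] * dt)

# Last server update: player at (0, 0) running straight down the hallway.
last_pos, velocity = (0.0, 0.0), (5.0, 0.0)

# 400 ms with no server updates: the client keeps drawing them going straight.
predicted = predict(last_pos, velocity, 0.4)   # (2.0, 0.0)

# The late server update arrives: they actually turned at the intersection.
actual = (0.0, 2.0)

# The visible "warp" is the prediction error the client must correct.
error = ((predicted[0] - actual[0]) ** 2 +
         (predicted[1] - actual[1]) ** 2) ** 0.5
```

The longer the gap between server updates, the larger that correction, which is exactly the "suddenly they are behind you" effect.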
Lag was altogether far more infuriating than low FPS because it caused confusing scenarios like this: players could duck around one corner from your perspective but have actually gone the other way. Not to mention when such latency affects things like rocket jumping.
It has become common to refer to low-FPS situations as "lag", but this clouds the term, particularly since modern games still suffer from problems that are solely the domain of network lag (in Minecraft, blocks not breaking, breaking in a delayed fashion, reappearing, etc. are all lag).
For Minecraft, players have come to call it "FPS lag" and "block lag", but the fact is that there is no "FPS lag".
The common counter-argument is that, because there is latency between when a player presses something and when they see a result on the screen, low FPS is lag.
This fails to consider that, by that reasoning, there will always be an unavoidable 1/60th of a second of lag. It also fails to take into account that lag, for a number of years, referred exclusively to network latency and had no usage outside of networks. Even on the original IBM PC, the slow operation of a long "DIR" command was never called "lag". It's not solely a matter of a word being used wrong, either; it's a word being used in such a way that it becomes completely imprecise. If lag can mean both "network latency" and "low FPS", then the word becomes useless, because those two situations have completely different causes, effects, and possible solutions.
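The arithmetic behind that distinction is worth making explicit. Both quantities are measured in milliseconds, but they come from different places and have different fixes (the figures below are illustrative assumptions, not measurements from any particular game):

```python
# Frame time and network latency share a unit (milliseconds) but have
# completely different causes and solutions.

def frame_time_ms(fps):
    """Time between displayed frames at a given frame rate."""
    return 1000.0 / fps

render_delay = frame_time_ms(60)   # ~16.7 ms: bounded by the display, unavoidable
low_fps_delay = frame_time_ms(10)  # 100 ms between frames: a rendering problem
network_rtt = 150.0                # assumed 150 ms round trip: a network problem

# Raising FPS does nothing for the 150 ms round trip, and a faster
# connection does nothing for the 100 ms frame time.
```

Which is the point of the whole argument: calling both of these "lag" hides which of the two you actually need to fix.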
Worse still is when we introduce vertical sync.
Most people have LCD displays nowadays, which makes the whole VSync thing even more confusing, because vertical sync is, quite literally, something that only "exists" in a physical sense with CRT technology: the term itself refers to how quickly the electron beam makes a full sweep from top to bottom. Since electron beams don't exist on other display types, "vertical sync" has no real-world equivalent there; it survives mostly as a timing convention. For analog output it is handled by the DAC on the graphics adapter, and that digital-to-analog chip is only ever employed for creating an analog signal, so if you connect your monitor digitally with a properly supported DVI cable, there is no analog vertical sync signal at all, only a vestigial blanking interval in the timing.
This raises the question, then: what does vertical sync even mean in that case?
Well, for standard analog VGA signals, the vertical blank interval is indistinguishable from a signal representing black. This isn't a problem for CRT displays, since that vertical blank exists to allow the electron gun to reposition itself back at the top of the screen (of course, the electron gun doesn't actually change its direction; rather, the input voltages on the magnetic yoke surrounding the cathode ray tube are adjusted to move the beam). For LCD displays, this is only an issue when the monitor uses VGA input and has to decode the VGA signal with its analog-to-digital converter. I do not know the technology behind it, but I would hazard a guess that some of this is what the various settings such as clock and phase refer to.
As for screen tearing on DVI-D: I am unaware of specifically how the data is sent digitally, but for CRT displays and analog signals it is sent in raster scanlines. Tearing results if the buffer is flipped while a frame is being scanned out. VSync synchronizes the RAMDAC's buffer flipping with the VBlank signal. For digital displays using DVI-D, I would expect the problem to be reduced, since the pixel data is not encoded and decoded in a form designed for reproduction by analog equipment. I haven't experienced any screen tearing on my 2560x1440 monitor using a DVI-D dual-link cable to my GTX 770, even when frame rates far exceed 60 (my refresh rate setting), but it was fairly common on my VGA-connected 1440x900 screen with the same adapter and the same applications and games.
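The tearing mechanism itself is easy to model. Here is a toy simulation (not any real driver or display API) of a display reading the front buffer one scanline at a time; if the buffers are flipped mid-scan, the top of the screen shows the old frame and the bottom shows the new one:

```python
# Toy model of screen tearing: the display scans the front buffer line
# by line. An unsynchronized buffer flip mid-scan splits the picture
# between two frames; VSync delays the flip until the scan finishes.

LINES = 4

def scan_out(front, flip_at=None, back=None):
    """Read the screen line by line, optionally flipping buffers mid-scan."""
    shown = []
    for line in range(LINES):
        if flip_at is not None and line == flip_at:
            front = back            # unsynchronized flip mid-scan: tearing
        shown.append(front[line])
    return shown

old = ["old0", "old1", "old2", "old3"]
new = ["new0", "new1", "new2", "new3"]

torn = scan_out(old, flip_at=2, back=new)   # top half old, bottom half new
vsynced = scan_out(new)                     # flip waited for vblank: one frame
```

Note that nothing in this model depends on the signal being analog or digital; the flip timing relative to scan-out is what matters, which is why tearing can in principle occur on digital links too.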
I know my reply is fairly late, but a $700 budget is not going to get what you need.
You'll need two separate storage devices: running the game from one (or a RAID 0 array) and having Fraps record to a different disk really speeds up the FPS while recording, especially with shaders.
You will also need a pretty beefy CPU. I'd recommend an Intel i5-4670K: in my opinion it delivers the best gaming performance for a CPU, and it performs a bit better than the i7-4770K in games. I'd also recommend a GTX 770 or 780; the graphics card alone will cost you around $330-$450. Now, Minecraft isn't a very GPU-heavy game, but it does use OpenGL... so a strong CPU is a must for sure.
My YouTube Channel --->https://www.youtube.com/channel/UCM70mQPHXT9RC8skS5pK6Vg
Exactly what is your definition of lag?
Skipping frames.
Also, I know this was last commented on in 2013, but it's still a useful tidbit of advice.
"Recording and a realistic texturepack are going to drop you like from 120 to 60" - Gregrs 2014
I run about 90-110 FPS playing Minecraft with max settings and no texture pack.
However, this computer can't run shaders, and it cost $2,000.
My $1,700 high-end Intel/nVidia rig runs shaders maxed out, but only at 60-70 FPS.