r4z0r84 said:
Have you only played one PC game in your life?
I studied electronics, my first computer was a 1980s TI-99/4A, and I built my own current PC gaming rig, which can max out any game except Metro 2033.
r4z0r84 said:
Yes, some PC games are coded horribly and will not go above 30fps. Need for Speed Underground, for example, is capped and designed to run at 30fps by default; force vsync off and you get thousands of frames per second.
60 fps is recommended for first person shooters as a MINIMUM
30 fps is recommended for flying/racing games as a MINIMUM
This is backwards. Crysis, for example, is a first person shooter designed specifically to be played at 30fps. Any argument otherwise is patently absurd, as any review of the game will verify. It certainly plays better at 50+ fps, but the game was designed with effects like motion blur so it could be played at lower frame rates. However, those same effects like motion blur make higher frame rates and faster gameplay more difficult, which is the point I have been making all along.
Other games like Quake that are designed specifically for 60fps and up are often played competitively, and players will use 120Hz monitors and 180fps just to smooth out any remaining bumps in the program so they can get that extra one hundredth of a second response time over the competition.
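To put rough numbers on that, here is nothing more than frame-time arithmetic (not tied to Quake or any other specific engine): every step up in frame rate shaves a few more milliseconds off the delay between an input and the frame that displays it.

    // Back-of-the-envelope frame-time math; illustrative only.
    #include <cstdio>

    int main() {
        const double rates[] = {30.0, 60.0, 120.0, 180.0};
        for (double fps : rates) {
            double frame_ms = 1000.0 / fps;  // how long a single frame occupies the screen
            std::printf("%6.1f fps -> %5.2f ms per frame\n", fps, frame_ms);
        }
        return 0;
    }

Going from 60fps to 120fps cuts the per-frame time from about 16.7ms to 8.3ms, and 180fps brings it down to roughly 5.6ms, which is exactly the kind of margin competitive players are chasing.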
r4z0r84 said:
Most true PC gamers, NOT console gamers, like their games to have "no lag" on maximum settings: photorealistic images without stutter, well above 300fps. Yes, you can live with a screen/LCD/monitor/TV running at the minimums stated above, but no one in their right mind would enjoy it compared to running the game optimally.
Lag is an unavoidable part of life. Everything from your input devices to the hard drive to the RAM, CPU, GPU, and monitor contributes to lag, and until someone invents radically different technology we will always have lag. Ideally the entire system would produce no more than 2ms of lag, which is the fastest human response time, while the LCD response times of most monitors alone are lucky to be that fast. With their extra layers of abstraction, PCs tend to have more lag than console games unless the console game deliberately introduces lag: upwards of 20ms to over 100ms, which is absurd considering, again, that the human response time is roughly 2ms.
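For illustration, here is how those contributions stack up. The individual figures below are placeholder guesses rather than measurements of any real system, but the point is that each stage adds its share, which is how you end up in that 20ms-to-100ms range.

    // Rough input-to-photon latency budget; every number here is an
    // illustrative guess, not a measurement of any particular system.
    #include <cstdio>

    int main() {
        struct Stage { const char* name; double ms; };
        const Stage pipeline[] = {
            {"USB polling / input driver",          4.0},
            {"game simulation (one 60fps frame)",  16.7},
            {"GPU render + buffer swap",           16.7},
            {"display scanout + LCD response",     10.0},
        };
        double total = 0.0;
        for (const Stage& s : pipeline) {
            std::printf("%-36s %6.1f ms\n", s.name, s.ms);
            total += s.ms;
        }
        std::printf("%-36s %6.1f ms\n", "total for this made-up stack", total);
        return 0;
    }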
r4z0r84 said:
Rage is a horrific example or a perfect example of horrible coding.
Rage is a next generation engine capable of getting 30fps on something as wimpy as an iPhone and 60fps on everything else. You may not appreciate speed since you don't even seem to understand what lag is, but id fans do. There is only so much video game developers can do about lag that is built into the hardware and other people's software like drivers, but they can certainly compensate for it with higher fps. I could go into the details of just how much of a next generation engine the id Tech 5 is, but that would require many pages of details. Again, these are all facts that are easily verifiable by any review of the game.
r4z0r84 said:
Look at Brink, another game that is hard coded to have mouse lag; the response time would make it close to impossible to have that game run VR, as it is hard coded to be bad (only had consoles in mind).
There is no point looking at fail engines to run VR, I understand that: Rage, Brink, GTA4, Saints Row 2, all non-UT3 engine games, non-Relic/id games, all EA games, and a few Ubisoft games (the majority before EA took over were great engines).
I've already told you the first game to be optimized to use the Oculus Rift is Doom 4, which uses the same engine as Rage. Speed is the issue, and being able to produce higher frames per second is one of the few ways to compensate for lag. What many modern gamers simply do not appreciate is what it takes to make an engine faster and yet still capable of providing cutting edge graphics.
r4z0r84 said:
CryEngine 3 may be a horribly coded engine like CryEngine 2, but the photorealism in the tech trailers I have seen would make me believe that running around in that environment with VR would be mind blowing. Personally I would think the Frostbite 2 engine is another nice looking engine.
No one disputes they are nice looking but, again, they are not built for speed. They are built to produce better graphics at lower speeds. A bit like the difference between a race car and a low rider. One is built for looks and fancy tricks at slow speeds, while the other is made for racing. Certainly if you put a powerful enough engine in a low rider it will go faster, but it will never be a race car.
r4z0r84 said:
Speaking personally as a former games design special effects artist (4 years' job experience), you do not understand that 99% of games today are designed with consoles in mind due to the decline of the PC games market. They now get ported back to PC, meaning yes, the 60 and 30fps limits are mostly hard coded into the game.
I am perfectly aware that 99% of games are designed with consoles in mind, and I would appreciate it if you stopped trying to tell ME what I know and don't know. If I want someone to do that I'll contact the psychic hotline.
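And since hard coded frame limits keep coming up: a cap like that is usually nothing exotic, just the main loop burning off whatever is left of its frame budget. A minimal sketch of the general pattern follows; it is not lifted from any shipped game, and update_simulation/render_frame are placeholders, not real functions.

    // Minimal sketch of a hard 30fps cap: do the frame's work, then sleep
    // away the rest of the ~33.3ms budget. Generic pattern, not any
    // particular engine's code.
    #include <chrono>
    #include <thread>

    int main() {
        using clock = std::chrono::steady_clock;
        const auto frame_budget = std::chrono::microseconds(33333);  // ~30fps

        for (int frame = 0; frame < 90; ++frame) {   // a few seconds' worth
            const auto start = clock::now();

            // update_simulation();  // game logic would run here
            // render_frame();       // draw submission would run here

            const auto elapsed = clock::now() - start;
            if (elapsed < frame_budget)
                std::this_thread::sleep_for(frame_budget - elapsed);
        }
        return 0;
    }

Take the sleep out, or let vsync do the pacing instead, and the same loop runs as fast as the hardware allows, which is part of why forcing vsync off on a lightweight game can produce those absurd frame counts.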
r4z0r84 said:
It's like trying to compare an Nvidia graphics card to an AMD/ATI graphics card: yes, their hardware may be extremely close, but it comes down to the drivers as to which one comes out on top.
Same with games. You can have something that looks horrible and lags like crazy, Diablo 3, with the exact same gameplay and more loading screens than a game that came out in 2006. Titan Quest, for example, has better graphics and the same gameplay with no loading screens (apart from porting); compare that to Diablo 3 and there is a massive amount of coding difference, plus budget. Blizzard/Activision was mainly focused on making money with Diablo 3, hence the majority of it was based around the real life auction house.
I don't understand how you can state that all games are designed for 30fps or 60fps. Possibly all the recently released games, yeah, I can agree with that, but the older games that are coded correctly run a hell of a lot faster than 30fps, plus you don't get vertical tearing. Have a look at the original Unreal Tournament engine: 999fps+ with no screen tears. It is definitely not photorealistic, but it can get well over those frame rates. If you have vsync forced on you would need a better monitor to get a higher fps, even though there is no real point.
Older games were designed to run faster because they did not possess tricks like motion blur to make them look decent at slower frame rates. Crysis was the first game to actually showcase such technology in a big way, and it became pretty standard because improved graphics will sell games faster than anything else. If improving the graphics meant having to include things like motion blur so it did not look like crap at lower frame rates, that's what they did, despite the fact motion blur doesn't exist in real life!
This is the same issue the movie industry has dealt with for a hundred years. Pan a movie camera fast and the background blurs, not because that's what the human eye sees, but because the camera simply does not take enough frames per second. Audiences have become so used to the effect and others like it that cinematographers routinely ham it up, using such effects as artistic expression and to set the mood in films, but it is a completely artificial convention, essentially no different than adding animation or whatever. The entire problem could easily be avoided by simply filming the movie at higher frame rates, but that adds expense.
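If it helps, the cheapest version of the motion blur trick is just blending the previous frame into the new one so fast motion smears instead of stuttering. Here is a toy sketch with made-up numbers; real engines do this per pixel in a shader, often with velocity buffers, so treat it as the idea rather than anyone's actual method.

    // Toy accumulation-style motion blur on a 4-pixel, single-channel "image".
    // A bright object hops one pixel per frame; blending leaves a trail that
    // hides the gap between positions at low frame rates.
    #include <array>
    #include <cstdio>

    int main() {
        std::array<float, 4> previous{0.f, 0.f, 0.f, 0.f};  // last displayed frame
        const float blend = 0.5f;                            // how much of the old frame survives

        for (int frame = 0; frame < 3; ++frame) {
            std::array<float, 4> current{0.f, 0.f, 0.f, 0.f};
            current[frame % 4] = 1.f;  // the moving object's new position

            for (int i = 0; i < 4; ++i)
                current[i] = blend * previous[i] + (1.f - blend) * current[i];

            std::printf("frame %d:", frame);
            for (float v : current) std::printf(" %.2f", v);
            std::printf("\n");
            previous = current;
        }
        return 0;
    }

The trade-off is the one described above: the smear makes 30fps look acceptable, but it also muddies exactly the fast motion that high-frame-rate players care about.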
r4z0r84 said:
What I would love to know about the Oculus is whether it has anything to do with the Oculus instance in World of Warcraft, and whether Blizzard will want to sue for copyright infringement. Other than that, what is the resolution of the screens used, and is the pixel structure visible?
Google really is your friend and a quick search will show you oculus is the Latin word for eye. I'm sure Blizzard and other companies might like to copyright the entire Latin language, but it just isn't going to happen.