Video games have, since their conception, been limited by the specifications of the consoles and platforms they were designed for. Consoles have been designed and redesigned countless times, refined through successive upgrades into the newest models. Each new generation of consoles has upgraded both its graphics hardware and its CPU in order to show off newer generations of games.
Setting aside arcade games, which ran on their own dedicated machines, perhaps a good starting point is the Sega Dreamcast (1999). While it was a simple console, it offered controller add-ons meant to enhance a game's impact and had four controller ports for multiplayer. In that same decade, the SNES came out in 1992 as one of the best-selling 16-bit consoles, with two controller ports and 128 KB of main RAM. Though simple, these machines proved to be good starting points for later consoles, even if they weren't the first ones on the market.
In 1994 the PlayStation arrived, with the standard two controller ports and the option to keep save data on memory cards. While it was still a 32-bit system, it was overshadowed in 1996 by the N64 and its newer 64-bit graphics. Unlike the PlayStation, however, the N64 kept save data in the cartridge itself, meaning there was one less item to bring when playing games at a friend's house. These cartridges proved more fragile than intended, though, and often needed 'cleaning' by blowing into them (despite warnings not to do this, many did it anyway because it often seemed to work). This marked the transition to 64-bit graphics, and the new consoles' CPUs kept up as well, allowing for more visually pleasing, immersive games.
Come 2000, the PlayStation 2 arrived, followed in 2001 by Nintendo's successor to the N64, the GameCube, as well as a new contender, the Xbox. These three consoles held the vast majority of the console market at the time and competed to offer the best graphics, the best CPUs, and the most stable UIs for gaming. Years later, the rivalry continued with the release of the Xbox 360 (2005), Wii (2006), and PlayStation 3 (2006). This competition has pushed the graphics ceiling steadily higher and created demand for ever more memory-intensive games. While the 'console wars' raged on, game developers had their hands full keeping up, resulting in a steady push to improve their games.
With the current trend still going, the PlayStation 4 released today, the Xbox One (expected Nov. 22, 2013), and the Wii U (2012) will likely keep improving their system specs, each attempting to boast the better gaming experience. As consoles such as the Xbox One include features like Skype integration, 8 GB of RAM, an 8-core x86 processor, and voice and movement recognition, games will try to integrate these new features to improve the experience. Considering that voice, movement, graphics, and processors have all been added or upgraded so far, what will the next generation of consoles add?
This generation of consoles has a clear advantage over past generations, so it's depressing that so many people have already passed judgment on them despite the fact that they have only just been released. People mocked the Xbox One for being a media player first and a game console second. Partly because of this, there has been a notable backlash in forums telling people to just buy a good gaming PC and stop supporting the "console war". I've never found much validity in these people's "arguments", though, because consoles are dedicated machines. While it is already possible to build a computer with specifications greater than the newest consoles, their dedicated design makes them more capable for less money than an "equal" gaming PC would be.