8:50 PM - April 14, 2011 by Douglas Perry -
source: Acer
Acer announced two new 3D LCDs.
Both monitors run at a resolution of 1920x1080 and support 3D via HDMI or DVI-DL from PCs, as well as from Blu-ray players, game consoles, cameras and 3D TV programming via a set-top box.
The 27-incher (HN274H) supports Nvidia 3D Vision and comes with a pair of Nvidia's 3D Vision glasses by default. The 23.6-inch model (HS244HQ) does not include Nvidia 3D Vision support and instead ships with Acer's own 3D glasses. A separate Nvidia transmitter is not required, as one is built into the displays. According to Acer, any other type of active shutter glasses also works with its new displays.
The 27-inch version provides a dynamic contrast ratio of 100 million:1, while the smaller display is rated at 12 million:1. Both displays use LED backlights and offer a response time of 2 ms, Acer said. Prices start at $449 for the 23.6-inch LCD; the 27-inch version carries an MSRP of $689.
Friday, 15 April 2011
E3 Rumor: Powerful Wii HD with HD Controller
11:20 PM - April 15, 2011 by Jane McEntegart -
source: Tom's Hardware US
A deluge of Wii 2 rumors with your coffee, sir?
Earlier in the week it was reported that Nintendo would be dropping the price of its motion-sensing Wii console to $150, bringing the cost down a full hundred dollars from the launch price. Though it was hard to argue with the logic (the Wii is five years old at this stage, and sales are declining), rumor had the price cut scheduled for May 15, just three weeks before E3. Many wondered why Nintendo would make that kind of announcement so close to E3. Why not just wait a few weeks and announce it at the convention?
Well, today’s rumors lend credence to reports of a price cut while also providing a reason for not waiting until E3 to announce it: Nintendo is planning a Wii 2 launch for E3. That’s the latest. According to sources from both IGN and Game Informer, the device will be revealed at or before E3 (which runs June 7 to 9), and will support HD. Though GI’s sources couldn’t agree on how the graphics would compare to that of the Xbox 360 or the PS3, IGN’s sources say "significantly more powerful than the PlayStation 3 and Xbox 360." These same sources also revealed that the new Wii will be backwards compatible with older Wii games.
Throwing yet more turf on the Fire of Hope is a report from CVG claiming that 1080p is not the only thing the new Wii will have going for it. Nope, Computerandvideogames.com reports that the new Wii will also come with a new controller. Sources say the controller will be quite different to the original in that it will also feature an integrated HD display. CVG’s sources also confirmed IGN’s report regarding the power of the console and backwards compatibility.
"Nintendo's plans sound unreal," one source said. "Publishers are already planning launch titles and it's all very exciting.
"The hardware is even more powerful than current HD consoles and backwards compatible with Wii.”
All three publications are saying it’s going to be an E3 reveal but a 2012 launch.
So, who’s excited?
Friday, 25 March 2011
Nvidia GeForce GTX 590 3 GB Review: Firing Back With 1024 CUDA Cores
8:00 AM - March 24, 2011 by Chris Angelini
AMD shot for—and successfully achieved—the coveted “fastest graphics card in the world” title with its Radeon HD 6990. Now, Nvidia is gunning for that freshly-claimed honor with a dual-GF110-powered board that speaks softly and carries a big stick.
In this corner...
Today, the worst-kept secret in technology officially gets the spotlight. Hot on the heels of AMD’s Radeon HD 6990 4 GB introduction three weeks ago, Nvidia is following up with its GeForce GTX 590 3 GB. According to Nvidia, it could have introduced this card more than a month ago. However, we know it continued revising its plans for a new flagship well into March. The result is a board deliberately intended to emphasize elegance, immediately after the Radeon HD 6990 bludgeoned us over the head with abrasive acoustics.
Pursuing quietness might sound ironic, given that GPUs based on Nvidia’s Fermi architecture are notoriously hot and power-hungry. To think the company could put two on a single PCB and not out-scream AMD’s dual-Cayman-based card is almost ludicrous. And yet, that’s what Nvidia says it did.
It admits that getting there wasn’t an easy task, though. Compromises were made. For example, Nvidia uses the same mid-mounted fan design for which we chided AMD. It dropped the clocks on its GPUs to help keep thermals under control. And the card still uses more power than any graphics product we’ve ever tested.
And in the other corner...
But it’s quiet. Crazy-freaking quiet. The quietest dual-GPU board I’ve tested since ATI’s Rage Fury Maxx (how’s that for back-in-the-day?). Mission accomplished on that front. The question remains, though: was Nvidia forced to give up the farm just to show AMD that hot cards don't have to make lots of noise?
Under The Hood: Dual GF110s, Both Uncut
In my discussions with Nvidia, the company made it clear that it wanted to use two GF110 processors, and it didn’t want to hack them up. Uncut GF110s, as you probably already know from reading GeForce GTX 580 And GF110: The Way Nvidia Meant It To Be Played, employ four Graphics Processing Clusters, each with four Streaming Multiprocessors. You’ll find 32 CUDA cores in each SM, totaling 512 cores per GPU. Each SM also offers four texturing units, yielding 64 across the entire chip. Of course, there’s one Polymorph engine per SM as well, though as we’ve seen in the past, Nvidia’s approach to parallelizing geometry doesn’t necessarily scale very well.
As in our GTX 580 review, GF110 doesn't get cut-back here
The GPU’s back-end features six ROP partitions, each capable of outputting eight 32-bit integer pixels at a time, adding up to 48 pixels per clock. An aggregate 384-bit memory bus is divisible into a sextet of 64-bit interfaces, and you’ll find 256 MB of GDDR5 memory at all six stops. That adds up to 1.5 GB of memory per GPU, which is how you arrive at the GeForce GTX 590’s 3 GB.
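To make the resource math above easier to follow, here is a minimal Python sketch (our own illustration, not anything from Nvidia) that adds up the per-GPU and per-board totals from the GF110 block counts just described:

```python
# Hypothetical sanity check of the GF110 / GTX 590 resource math described above.
GPCS = 4                     # Graphics Processing Clusters per GF110
SMS_PER_GPC = 4              # Streaming Multiprocessors per GPC
CORES_PER_SM = 32            # CUDA cores per SM
TEX_UNITS_PER_SM = 4         # texture units per SM
ROP_PARTITIONS = 6           # ROP partitions per GPU
PIXELS_PER_PARTITION = 8     # 32-bit integer pixels per clock, per partition
MEM_PER_CONTROLLER_MB = 256  # GDDR5 attached to each 64-bit memory controller
GPUS_PER_BOARD = 2           # GeForce GTX 590 carries two GF110s

sms = GPCS * SMS_PER_GPC                                   # 16 SMs per GPU
cores_per_gpu = sms * CORES_PER_SM                         # 512 CUDA cores
tex_per_gpu = sms * TEX_UNITS_PER_SM                       # 64 texture units
pixels_per_clock = ROP_PARTITIONS * PIXELS_PER_PARTITION   # 48 pixels per clock
bus_width_bits = ROP_PARTITIONS * 64                       # 384-bit aggregate bus
mem_per_gpu_gb = ROP_PARTITIONS * MEM_PER_CONTROLLER_MB / 1024  # 1.5 GB per GPU

print(f"{cores_per_gpu} cores per GPU, {cores_per_gpu * GPUS_PER_BOARD} on the board")
print(f"{tex_per_gpu} texture units per GPU, {pixels_per_clock} pixels per clock")
print(f"{bus_width_bits}-bit bus, {mem_per_gpu_gb} GB per GPU, "
      f"{mem_per_gpu_gb * GPUS_PER_BOARD} GB total")
```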
Nvidia ties GTX 590’s GF110 processors together using its own NF200 bridge, which takes a single 16-lane PCI Express 2.0 interface and multiplexes it out to two 16-lane paths—one for each GPU.
| | GeForce GTX 590 | GeForce GTX 580 | Radeon HD 6990 | Radeon HD 6970 | Radeon HD 6950 |
|---|---|---|---|---|---|
| Manufacturing Process | 40 nm TSMC | 40 nm TSMC | 40 nm TSMC | 40 nm TSMC | 40 nm TSMC |
| Die Size | 2 x 520 mm² | 520 mm² | 2 x 389 mm² | 389 mm² | 389 mm² |
| Transistors | 2 x 3 billion | 3 billion | 2 x 2.64 billion | 2.64 billion | 2.64 billion |
| Engine Clock | 607 MHz | 772 MHz | 830 MHz | 880 MHz | 800 MHz |
| Stream Processors / CUDA Cores | 1024 | 512 | 3072 | 1536 | 1408 |
| Compute Performance | 2.49 TFLOPS | 1.58 TFLOPS | 5.1 TFLOPS | 2.7 TFLOPS | 2.25 TFLOPS |
| Texture Units | 128 | 64 | 192 | 96 | 88 |
| Texture Fillrate | 77.7 Gtex/s | 49.4 Gtex/s | 159.4 Gtex/s | 84.5 Gtex/s | 70.4 Gtex/s |
| ROPs | 96 | 48 | 64 | 32 | 32 |
| Pixel Fillrate | 58.3 Gpix/s | 37.1 Gpix/s | 53.1 Gpix/s | 28.2 Gpix/s | 25.6 Gpix/s |
| Frame Buffer | 2 x 1.5 GB GDDR5 | 1.5 GB GDDR5 | 2 x 2 GB GDDR5 | 2 GB GDDR5 | 2 GB GDDR5 |
| Memory Clock | 853 MHz | 1002 MHz | 1250 MHz | 1375 MHz | 1250 MHz |
| Memory Bandwidth | 2 x 163.9 GB/s (384-bit) | 192 GB/s (384-bit) | 2 x 160 GB/s (256-bit) | 176 GB/s (256-bit) | 160 GB/s (256-bit) |
| Maximum Board Power | 365 W | 244 W | 375 W | 250 W | 200 W |
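If you want to check how the derived rows in that table fall out of the clocks and unit counts, the short Python sketch below reproduces the GeForce GTX 590 column. The only assumption is the usual Fermi hot clock, with the shader domain running at twice the 607 MHz engine clock:

```python
# Hypothetical reconstruction of the GeForce GTX 590 column in the table above.
engine_clock_mhz = 607
shader_clock_mhz = 2 * engine_clock_mhz  # assumed Fermi hot clock (2x engine clock)
mem_clock_mhz = 853                      # GDDR5 base clock; data rate is quad-pumped
cuda_cores = 1024                        # 2 x 512
texture_units = 128                      # 2 x 64
rops = 96                                # 2 x 48
bus_width_bits = 384                     # per GPU

tflops = cuda_cores * shader_clock_mhz * 2 / 1e6  # 2 FLOPs (one FMA) per core per clock
gtex_s = texture_units * engine_clock_mhz / 1e3   # texture fillrate
gpix_s = rops * engine_clock_mhz / 1e3            # pixel fillrate
gb_s_per_gpu = mem_clock_mhz * 4 * (bus_width_bits / 8) / 1e3  # memory bandwidth

print(f"~{tflops:.2f} TFLOPS, {gtex_s:.1f} Gtex/s, {gpix_s:.1f} Gpix/s, "
      f"~{gb_s_per_gpu:.0f} GB/s per GPU")
# -> ~2.49 TFLOPS, 77.7 Gtex/s, 58.3 Gpix/s, ~164 GB/s per GPU
```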
What changed from the ill-received GF100-based GeForce GTX 480 to GF110? From my GeForce GTX 580 review:
“The GPU itself is largely the same. This isn’t a GF100 to GF104 sort of change, where Shader Multiprocessors get reoriented to improve performance at mainstream price points (read: more texturing horsepower). The emphasis here remains compute muscle. Really, there are only two feature changes: full-speed FP16 filtering and improved Z-cull efficiency.
GF110 can perform FP16 texture filtering in one clock cycle (similar to GF104), while GF100 required two cycles. In texturing-limited applications, this speed-up may translate into performance gains. The culling improvements give GF110 an advantage in titles that suffer lots of overdraw, helping maximize available memory bandwidth. On a clock-for-clock basis, Nvidia claims these enhancements have up to a 14% impact (or so).”
That's a 12-layer PCB with 10-phase power, and NF200 in the middle
Other than that, we’re still talking about two pieces of silicon manufactured on TSMC’s 40 nm node and composed of roughly 3 billion transistors each. At 520 square millimeters, GF110 is substantially larger than AMD’s Cayman processor, which measures 389 mm² and is made up of 2.64 billion transistors.
Now, it’s great to get all of those resources (times two) on GeForce GTX 590. However, while the GeForce GTX 580 employs a 772 MHz graphics clock and 1002 MHz memory clock, the GPUs on GTX 590 slow things down to 607 MHz and 853 MHz, respectively.
As a result, this card’s performance isn’t anywhere near what you’d expect from two of Nvidia’s fastest single-GPU flagships. That might be alright, though. After all, AMD launched Radeon HD 6970 as a GeForce GTX 570-contender; the 580 sat in a league of its own. So, although AMD’s Radeon HD 6990 comes very close to doubling the performance of the company’s quickest single-GPU cards, GeForce GTX 590 doesn’t have to do the same thing in order to be competitive at the $700 price point AMD already established and Nvidia plans to match.
We already know what AMD had to do in order to deliver “the fastest graphics card in the world.” Now, how does Nvidia counter?
Read Full Here
AMD: DirectX Comments Taken Out of Context
Just over a week after Richard Huddy, worldwide developer relations manager for AMD's GPU division, spoke out against DirectX and other APIs, the company now says that it supports DirectX and that the previous comments were taken out of context and exaggerated. While that may be true, Huddy's latest interview with CRN, given alongside Neal Robison, AMD's senior director of ISV relations, also comes across as damage control.
"The [Bit-tech] interview started off being about OpenGL, and the way APIs are developed," Huddy said. "Obviously there’s pressure from Microsoft on hardware vendors to develop DirectX in a variety of ways. We spend a great deal of time getting feedback from game developers in the early phase of our hardware development, for products that are two or three years away from going to market."
The previous interview claimed that developers want the API to "go away," that it's getting in the way of creating some truly amazing graphics. Huddy himself was even quoted saying that developers have admitted this in conversations. But in this latest interview, he said that only a handful of high-end gaming developers were looking to bypass DirectX and code directly to hardware.
"It’s not something most developers want," he said. "If you held a vote among developers, they would go for DirectX or OpenGL, because it's a great platform. It’s hard to crash a machine with Direct X, as there’s lots of protection to make sure the game isn’t taking down the machine, which is certainly rare especially compared to ten or fifteen years ago. Stability is the reason why you wouldn’t want to move away from Direct X, and differentiation is why you might want to."
"We saw some of the chaos before DirectX coalesced the industry,” Robison added. "In the past there were all kinds of APIs developers had to worry about."
Later on in the interview, Huddy revealed that there's a division starting to take place in the gaming industry: those that want to stick with DirectX and other APIs, and those that want to move on in another direction. He even provided an example, saying that developers like DICE have highly-tuned, efficient rendering machines that rely on DirectX. Then there are developers like Crytek who literally sell hardware because they seemingly develop for technologies in the future, and could actually bypass an API.
"Many people are still shipping DirectX 9 games, which is still a perfectly reasonable way to go," Huddy admitted. "As hardware vendors we want to keep bringing out new hardware that produces something visually exciting. We want to be able to innovate. In the feedback we’re getting, some say 'move on from Direct X' and some say 'DX is absolutely the right place to play.'"
He also said that the comment about developers wanting the API to "go away" shouldn't be taken literally. Instead, APIs and middleware need to be innovative and adapt with evolving software code as well as GPU hardware, essentially taking "a different form."
Unlike the first interview, Huddy's follow-up to the Bit-Tech interview is rather lengthy. To get the full four-page dose, head here.
Ubisoft: 3DS Can Handle Unreal Engine 2
5:30 PM - March 25, 2011 - By Kevin Parrish -
Source : Gamespot UK
Ubisoft is reportedly using Epic's Unreal Engine 2 for Splinter Cell 3DS.
So just how powerful is Nintendo's upcoming 3DS handheld gaming system? According to Epic Games' Mark Rein, there aren't enough horses under the hood to run the company’s more recent Unreal Engine 3. In fact, the iPhone 3GS, iPhone 4, iPod Touch 3 and 4, and a number of recent Android smartphones can run the engine without a hitch. Nintendo's new 3DS system apparently cannot.
"There's nothing against Nintendo," he said during GDC 2011. "I hate that people somehow think that's the case. If we felt it could run [Unreal Engine] and deliver the kind of experience people license our technology to build, we'd be on [the 3DS]. There's only so much time in the day; our engine requires a certain level of hardware capabilities to make our pipeline, our tools work -- and we work on the ones that do. The second Nintendo releases a piece of hardware that can run our engine well, we'll be on it like water on fish."
However, in an interview with Gamespot UK, Ubisoft's Fabrice Cuny claims that Tom Clancy's Splinter Cell 3DS is running on the Nintendo 3DS via Epic's Unreal Engine 2. "The 3DS is powerful, and we are able to run the Unreal Engine on this console, which is pretty impressive for a handheld machine, and the 3D doesn’t affect the performance (thanks to my amazing programmers)," he told the site. "The architecture is different compared to a Wii or some other platforms that we had to work with here at Ubisoft Montreal."
Cuny added that the 3DS is more comparable to a platform sitting between a DSi and a Wii. "We are able to create games anywhere from a puzzle game to very high-end game such as Splinter Cell 3DS," he said. "The tools on the 3DS were brand new, and with every development phase, we had some tools with bugs and crashes. But with version after version, Nintendo provided us a set of tools and the support to help us debug and optimize the game."
So if the 3DS can't run games using the Unreal Engine 3, but can do so with the previous engine, what does that mean for gamers? Look at it this way: if Epic were to bring the Unreal Tournament franchise to Nintendo's handheld, the device may have the ability to support the original Unreal Tournament, Unreal Tournament 2003 and Unreal Tournament 2004. It wouldn't be able to handle Unreal Tournament 3.
Of course, that's just an example, and doesn't mean any Unreal Tournament title will be ported to the 3DS. It's also currently unknown if the 3DS can handle the Unreal Engine in a first-person perspective (running at an acceptable frame rate). Even more, Epic has also tweaked its Unreal Engine 3 to work on iOS and Android platforms, so it's likely Ubisoft did the same for Unreal Engine 2.
The Nintendo 3DS launches here in the States this Sunday at 12 am EST. Various Best Buy locations will host a launch party starting Saturday at 9pm EST. Check your local store for details.
Saturday, 19 March 2011
Crysis 2 Performance Previewed And Analyzed
12:00 AM - March 18, 2011 by Don Woligroski
A year is an eternity when it comes to the ever-changing world of PC graphics technology. It is, therefore, a testament to the developers at Crytek that the original Crysis, released November 2007 (more than three years ago), continues to set the bar for PC game graphics. This title created a standard so lofty that we continue to get requests for benchmarks in Crysis in our graphics card reviews, more than three years later.
In the final analysis, Crysis was probably more successful at showing off what PC graphics can do than it was at being a great game.
But Crytek has been far from idle for the last three years, and Crysis 2 is about to hit store shelves on the 22nd of March. Happily, the company gave us a chance to experiment with the game via a free multiplayer demo (a demo that is no longer playable, by the way; Crytek disabled it on March 16th). Because of this, we’re able to provide you with detailed information regarding graphics card performance in Crysis 2.
The Crysis 2 Multiplayer Demo
Before we look at that data, let’s discuss the gameplay aspect. Crytek went back to the drawing board with the multiplayer component of Crysis 2, and it’s clear the company paid a lot of attention to the Call of Duty series. Crysis 2 tracks kills and unlocks ranks and achievements in a very similar fashion. Even the feel is similar.
But Crytek’s newest title is so much more than a Call of Duty clone. Of course, the nanosuit’s strength, speed, armor, and cloaking capabilities remain, but the addition of a new ‘nanovision’ mode helps you see other combatants and even cloaked enemies. And all of these wonderful abilities come with an associated energy cost. This adds a whole other dimension to the standard first-person shooter combat formula.
Yes, the controls have been streamlined, but not necessarily in a bad way. It’s easy to point a finger and say the game is dumbed-down for consoles. But in practice, the new scheme makes much more sense. The default mode is strength and speed, but these abilities don’t take any energy unless you use them by jumping or running. Armor and stealth modes can be toggled with the Q and E keys, respectively, but enabling either of these modes will constantly consume energy. Nanovision mode also eats energy, but at a much slower rate than armor or stealth. Energy management is key, and the most successful players are the ones who do that effectively. The simple-but-sensible control scheme helps with that.
The two maps included in the demo are Skyline and Pier 7, both of which are just the right size for a team deathmatch of eight to 12 players. But there’s also a new game mode called ‘capture the pod’: an alien ship drops an item, and the team that occupies the area surrounding it gains points over time. After a couple minutes, the pod becomes unstable and explodes, and this sequence of events repeats until one of the teams has gained enough points to win. It’s a good metagame alternative to simple team deathmatch.
And that’s about it. Crysis 2 multiplayer might not sound groundbreaking, but it’s certainly very addicting. In this author’s opinion, it contains the best PvP elements of Call of Duty and Aliens vs. Predator, but ends up being more fun and challenging than both.
With no single-player demo for us to try, that’s as much as we can say about the game play until we get our hands on the full release. Now let’s talk about performance.
Read More
Half Of All Notebooks To Use gCPUs This Year
6:00 PM - March 18, 2011 by Douglas Perry -
source: Tom's Hardware US
The introduction of Intel's Sandy Bridge and AMD's Fusion processors will dramatically increase the penetration of graphics-enabled CPUs (gCPUs), market research firm IHS iSuppli said today.
According to a new forecast, 50% of notebooks and 45% of desktops will use gCPUs in 2011, up from 39% and 36%, respectively. By 2014, 83% of notebooks will use gCPUs with integrated graphics processors, while the share among desktop PCs will hit 76%, the firm said. "With GEMs [graphics enabled microprocessors] capable of generating the total graphic output of a PC, no additional graphics processor or add-in graphics card is needed," said Peter Lin, principal analyst for compute platforms at IHS. "Computers today are serving up ever-richer multimedia experiences, so the graphics capabilities of PCs have become more important, driving the rising penetration of GEMs."
The obvious question would be what the effect on discrete graphics cards may be, even if AMD is unlikely to torpedo the demand for its own products. IHS noted that "discrete graphics cards will remain the solution of choice for leading-edge graphics, providing high-end performance for applications such as games." GEMs, as far as their graphics capability is concerned, are likely to be targeted especially at mainstream and value PCs, IHS said.
Both AMD and Intel are positioning their gCPUs as a way to reduce the manufacturing cost of their chip solutions as well as a way to reduce the influence of third-party manufacturers within their platform environments as many users will perceive embedded graphics solutions as good enough for their purposes. While Intel is relying on a single general gCPU approach, AMD is expected to release five application platforms with five GEM microprocessor categories.
Via is also part of the game, but caters with its gCPU solutions to embedded and industrial applications, IHS iSuppli said.
AMD: DirectX Holding Back Game Performance
4:50 PM - March 18, 2011 by Kevin Parrish -
source: Bit-Tech
AMD claims that game developers actually want the API to go away.
With all the hype surrounding DirectX 11 and how it's changing the face of PC gaming with mind-blowing eye candy, Richard Huddy, worldwide developer relations manager for AMD's GPU division, claims that developers actually want the API to go away, that it's getting in the way of creating some truly amazing graphics.
"I certainly hear this in my conversations with games developers," he told Bit-Tech in an interview. "And I guess it was actually the primary appeal of Larrabee to developers – not the hardware, which was hot and slow and unimpressive, but the software – being able to have total control over the machine, which is what the very best games developers want. By giving you access to the hardware at the very low level, you give games developers a chance to innovate, and that's going to put pressure on Microsoft – no doubt at all."
Outside of a few developers who have announced that PC game development will take priority over console versions, a good chunk of the gaming industry is developing titles for the Xbox 360 and PlayStation 3 first and then porting them over to the PC. The result is that PC versions are only slightly superior to their console counterparts in a visual sense, even though a high-end graphics card has at least ten times the horsepower of the Xbox 360's Xenos GPU and the PlayStation 3's GeForce 7-series architecture.
What this means is that, although PC graphics are better than the console versions, developers can't tap into the PC's true potential because they can't program the hardware directly at a low level and are instead forced to work through DirectX. But there are benefits to working with APIs, including the ability to develop a game that will run on a wide range of hardware. Developers also get access to the latest shader technologies without having to work with low-level code.
But according to Huddy, the performance overhead of DirectX is a frustrating concern for developers. "Wrapping it up in a software layer gives you safety and security," he said. "But it unfortunately tends to rob you of quite a lot of the performance, and most importantly, it robs you of the opportunity to innovate."
He added that shaders, which were introduced back in 2002, were designed to allow developers to be more innovative and to create more visual variety in games. But now many PC games have the same kind of look and feel because developers are using shaders "to converge visually."
"If we drop the API, then people really can render everything they can imagine, not what they can see – and we'll probably see more visual innovation in that kind of situation."
The interview goes on to define the performance overhead of DirectX, explaining that the actual amount depends on the type of game in development. Huddy also talks about the possible problems of developing for a multiple GPU architecture on a low-level if the API is ignored.
"The problem with the PC is that you ideally want a PC that doesn't crash too much, and if a games developer is over-enthusiastic about the way they program direct to the metal, they can produce all sorts of difficulties for us as a hardware company trying to keep the PC stable," he said.
The interview is definitely an awesome read, so head here to get the full scoop.
Saturday, 12 March 2011
Watch the Next-Generation Unreal Graphics NOW
7:00 AM - March 11, 2011 by Marcus Yam -
source: Tom's Hardware US
Epic Games is now more than just a games developer – it's an engine technology maker that licenses its software to other developers to help designers make the games rather than the tools.
At GDC, Epic showed off the most bleeding-edge version of Unreal Engine 3. It was so advanced that it could be called Unreal Engine 4 -- except that it runs all on present-day hardware. Of course, it took no less than three GeForce GTX 580s in SLI to get this done, but it's amazing to think that something like this is possible with readily available hardware. Now all we need is the software.
You've seen the screenshots; now feast your eyes on it in motion:
Watch Epic's Mind Blowing Graphics in Video
And just in case you wanted to see some of the high-res stills, here are the screenshots once again:
Toyota, Honda, Nissan Close Factories After Quake
4:00 AM - March 12, 2011 - By Jane McEntegart -
Source : Tom's Guide US
Japan today experienced one of the most powerful earthquakes in its history. Though the extent of the damage is still not known, companies with factories based in the country are starting damage assessments and some have opted to shut down operations as a result of the destruction.
Bloomberg reports that Sony, Toyota, Honda, Nissan and Subaru (Fuji Heavy Industries) have stopped production in several of their factories after the 8.9 magnitude earthquake. Sony has stopped production and evacuated six of its facilities, Toyota has closed three, Honda has shut two, Nissan has closed four, and Fuji Heavy Industries, the maker of Subaru cars, has closed five factories.
Many other companies are still busy assessing the damage from the earthquake. Panasonic reports that several of its employees sustained minor injuries, while one man working for Honda was killed at the carmaker’s R&D facility when a wall fell on him. Two workers at Nissan suffered minor injuries. A Toyota spokesperson told Bloomberg that all workers were safe.
Google today launched a Person Finder to help people in Japan locate missing friends and relatives.
Read more about the effects of the earthquake, the 7.1 magnitude aftershock and tsunami on Bloomberg.
Sunday, 6 March 2011
Gartner Cuts PC Sales Forecast, Blames Tablets
Gartner now believes that PC sales will grow only 10.5% in 2011, down from a 15.9% prediction made on November 29 and 18.1% prior to that.
Instead of the initial forecast of 417 million PCs, Gartner now estimates 2011 unit sales to be in the 388 million range.
The primary reason is apparently a major slowdown in notebook sales. Instead of the 40% growth rates we have seen over the past few years, notebook shipments may only climb by about 10% this year, Gartner said. The reason for that slowdown, according to the market research firm, is that consumers could be delaying new notebook purchases and spending their money on media tablets instead. Gartner expects 54.8 million tablets to be sold this year, up from 19.5 million in 2010.
“We expect growing consumer enthusiasm for mobile PC alternatives, such as the iPad and other media tablets, to dramatically slow home mobile PC sales, especially in mature markets,” said George Shiffler, research director at Gartner. “We once thought that mobile PC growth would continue to be sustained by consumers buying second and third mobile PCs as personal devices. However, we now believe that consumers are not only likely to forgo additional mobile PC buys but are also likely to extend the lifetimes of the mobile PCs they retain as they adopt media tablets and other mobile PC alternatives as their primary mobile device."
There is the obvious question of how this trend, if the estimate is reasonably accurate, will affect Intel in particular, as the company's growth has relied heavily on notebook processor sales in the past. We get the sense that the company is still betting on notebook processors as a growth engine, but it is clear that the overall opportunity for increased chip shipments - which is the fundamental business approach of Intel - may be in smartphones and tablets these days.
AMD to Build 153,000sqft Data Center in Georgia
AMD filed a permit to build a data center in Suwanee, Georgia.
The building will provide about 153,000 square feet of space and will initially include only one building module as part of a 10-year plan.
AMD followed a similar strategy for its fab in Luther Forest, which is now operated by the spun-off GlobalFoundries; GlobalFoundries is expanding its fabs in a modular fashion as well. The data center is substantially cheaper than AMD's billion-dollar fabs of the past. According to the company, the initial construction cost is estimated to be about $25 to $30 million and is part of a data center consolidation approach.
Including the IT equipment, the total cost is estimated to be in the $100 million neighborhood. There is no roadmap detailing the future expansion of the site. However, AMD hopes that the new data center will be able to help AMD "leverage changes in the business environment in terms of cost."
Saturday, 5 March 2011
Sony Drops PSP Go Down to $150
Get the PSP slider for less now.
Late last month, Sony announced a price drop for the PSP-3000 moving it down to $129.99. That left a few wondering what would happen to the PSP Go, which at the time kept its same price. Now, however, we find out that the PSP Go has dropped down to $149.99.
It's official now on the U.S. PlayStation site here, so the price should be in effect everywhere.
Budgets aside, with the Nintendo 3DS just around the corner and the NGP/PSP2 already a sure thing, these older systems seem more like quick fixes for portable gaming.
We would still lean towards the $129.99 PSP-3000, thanks to its UMD slot that opens up the door for many more games at budget prices.
E-Type concept to be built
Jaguar ‘Growler’ rendering to become supercharged, 5.0-litre V8-powered reality
Posted by: Vijay Pattni,
Remember that rather delectable Jaguar E-Type concept we showed you last week?
Well, it's been scheduled for limited production. And by limited, we mean very.
Speaking to TopGear.com, Robert Palm of Swedish design studio Vizualtech, the chaps behind the car, said: "The first car will be ready in the summer of 2012 - if planning goes as expected."
Each ‘Growler' will be hand-built and is expected to take six months to assemble, with no more than three or four cars done simultaneously.
The car is based on the Jaguar XKR and will be built on a composite body glued to a carbon fibre chassis. Two tubular frames will then be bolted at the front and rear, which hold the steering, transmission, suspension and engine.
And that engine will be a remapped version of Jag's supercharged 5.0-litre V8 producing 600bhp. The Growler is expected to weigh around 1,550kg and, says Palm, should hit 62mph in under four seconds.
"If demand is overwhelming we might contract specialists like Steyr in Austria or Valmet in Finland," says Palm.
Pricing is tricky, but if small series production is started, expect to pay around £420,000. If only a few are made, expect to pay upwards of £850,000. Let's just hope it gets a better name...
Source: TopGear
Thursday, 3 March 2011
The Hottest Apps of 2011, Week 9!
Looking for useful or fun programs? Here are some recommendations from the Tom's Guide community for the ninth week of 2011.
Hot Apps is a weekly rundown of the most popular apps according to our sister site, Tom's Guide. The following apps are ranked from first to tenth by total downloads over the past week, making them community picks.
Unless otherwise specified, all featured apps are free, and run on Microsoft Windows 7, Vista, and XP.
Capture Fox. Here's a Firefox add-on that turns the web browser into a screencast tool. Capture Fox is great for recording video tutorials on websites, other Firefox add-ons, and even other programs. No updates for this beta app have been forthcoming since 2009, however, and it only works on Windows XP, Vista, and 7. New entry.
Horoscope. The astrology-driven app stays near the top of the list. This desktop gadget for Windows Vista and 7 provides regular updates on possible futures, based on the user's Zodiac sign. Remains at #2.
SkipScreen. A useful Firefox add-on designed for services like RapidShare and Megaupload, SkipScreen does as it's named. It bypasses the ad-filled web pages that file-sharing sites force their users to wait through. Great for surfers who are tired of being reminded that a monthly fee would get them faster downloads. New entry.
WoW Explorer. Another desktop gadget for Windows Vista and 7, WoW Explorer keeps World of Warcraft players updated on the status of the different game servers where they can log in and play their characters. Remains at #4.
FoxyTunes. At the very least, FoxyTunes lets you control music playback right from your web browser (Internet Explorer or Mozilla Firefox) or Yahoo Messenger. Supporting features include easy one-click access to lyrics, album covers, music videos, and artist bios. Users can also easily tell everyone else what they're listening to through Twitter or email. New entry.
UNetbootin. Need to create a flash drive or CD that can boot with a free OS or recovery environment like Ubuntu or Kaspersky's Rescue Disk? UNetbootin takes care of everything. A good internet connection is recommended for those who don't want to wait too long. Down from #5.
Evernote. This is the iPad client for the Evernote service. It lets users sync their notes and annotations—whether written or typed out—with an online database for easy access later on, and from other devices. Down from #6.
Omega Messenger. This app lets users manage multiple instant messaging accounts. Supported services include AIM, Windows Live Messenger, Yahoo Messenger, and even ICQ. Down from #7.
Angry Birds HD. The popular game is also available on the iPad. A bunch of colorful birds seek revenge on pigs who've stolen their eggs. Players launch the birds like catapult projectiles, so that the pigs' fortresses come crashing down like a house of cards. Down from #8.
PstPassword. A utility for unlocking Outlook PST (personal storage table) files, PstPassword is aimed at forgetful users who've let their Outlook password slip away. Down from #9.
Staff Picks: Paint.NET is a worthy free Photoshop replacement. Foxit PDF Reader takes up minimal system resources. BurnAware Free is a burning app that's fully compatible with Blu-Ray burners and Windows 7. And of course, who wouldn't want to download the latest versions of Mozilla Firefox and Thunderbird?
Battlefield 3 Shows Up At GDC, Looks Great
8:20 PM - March 2, 2011 by Tuan Mai -
source: PCGamer
Next major installment of the Battlefield series shows up.
Tuesday night at the 2011 Game Developers Conference, a select few members of the press were given a sneak peek at the highly anticipated Battlefield 3.
EA revealed a bit of the single-player campaign along with an in-depth analysis and preview of the game's incredible engine. Frostbite 2, unlike the console-limited engine of the CoD franchise, is designed for PCs, which makes Battlefield 3 a force to be reckoned with. Look out, Crysis.
Here's a snippet from PCGamer's coverage of the reveal:
The demo opened with a precis of the tech. Frostbite 2 uses animation systems developed for sports games to give characters heft and weight. As the soldiers turn into doorways, you can see the weight shift on their feet. The destructibility of the old Frostbite engine has been ramped up; bullets can chip away at masonry and concrete, while full bore explosives can tear down entire buildings. And when buildings collapse, they don’t vanish in a cloud of smoke and magically transform into burning husks – the destruction is more complex – signage wobbles and shakes, concrete awnings tumble down. The sound is as violent and deafening as Bad Company 2; bullets echo and snap with nightmarish cracks.
But it’s the sheer visual quality that’s the real star. I think it’s down to the lighting – the bright sunshine of the Iraq level was extremely impressive. When the demo transitioned to the indoors, shafts of sunlight shone through any open windows, creating gorgeous pillars of dust. It absolutely looked a step ahead of last year’s big shooters.
And if the screenshots aren't enough to give you chills, check out some video of in-game footage:
Battlefield 3 Footage
Battlefield 3 comes out in the fall, and if you ask us, that's not nearly soon enough.
Wednesday, 23 February 2011
New Samsung DRAM Boasts of 12.8GB/s Transfers
Samsung’s been pretty busy with its successful Galaxy line of smartphones and tablets, along with the Nexus S, but the company this morning reminded us all that it’s not been resting on its laurels when it comes to hardware.
Samsung today revealed that it’s developed a 1GB DRAM for mobile devices that boasts a wide I/O interface and low power consumption to boot. The new mobile DRAM is capable of transmitting data at 12.8GB per second, an eightfold increase in bandwidth when compared to mobile DDR DRAM, and it’s made possible by the use of 512 pins for data input and output compared to the last-gen mobile DRAMs’ 32 pins. All this comes with a reduction in power consumption amounting to roughly 87 percent.
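To see where that headline figure comes from (the per-pin data rate below is inferred from the quoted numbers, not stated by Samsung), the wide interface only needs a modest rate on each pin to hit 12.8GB/s:

# Rough arithmetic on the figures quoted above; the per-pin data rate is our
# own inference, not a number from Samsung's announcement.
io_pins = 512               # wide I/O interface width quoted in the article
bandwidth_gb_per_s = 12.8   # quoted aggregate bandwidth

per_pin_mbit = bandwidth_gb_per_s * 8 * 1000 / io_pins   # Mbit/s per pin
print(f"{per_pin_mbit:.0f} Mbit/s per pin")              # ~200 Mbit/s each

# Pushing the same aggregate bandwidth through a 32-pin mobile DDR interface
# would need 16x the per-pin rate, which is why widening the bus helps so much.
print(f"{per_pin_mbit * io_pins / 32:.0f} Mbit/s per pin on a 32-pin bus")  # ~3,200 Mbit/s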
"Following the development of 4Gb LPDDR2 DRAM (low-power DDR2 dynamic random access memory) last year, our new mobile DRAM solution with a wide I/O interface represents a significant contribution to the advancement of high-performance mobile products," said Byungse So, senior VP of memory product planning and application engineering at Samsung Electronics.
"We will continue to aggressively expand our high-performance mobile memory product line to further propel the growth of the mobile industry," he continued.
Samsung’s next move is to provide 20nm-class 4Gb wide I/O mobile DRAM sometime in 2013.