Playstation 3 Dev Box Hands On
2006/02/10 19:54:42: Posted by DM
The guys at Kikizo had a chance to get their hands on not one but three different Playstation 3 units at three different development houses. Controller in hand, they go through what the PS3 looks and feels like so far. The Kikizo guys used a regular old USB controller to play titles off an HDD unit in the dev boxes (no Blu-ray in sight, according to them), and they claim the hardware is more or less final, while the actual look of the case and control pad is not. They do say that developers are working with more or less the standard old Dual Shock, so maybe we can expect the PS3 pad to stay along those lines in its final form. The article is chock full of info; check it out.


Comments [40] | Avg. rating of 8.0




Comments

Written by optaviusx on 2006/02/10
PS3's maximum resolution being pushed is 720p... I said from the get-go that very few if any PS3 games would be 1080p. Their best bet has always been 720p for a true next-gen experience, as the PS3's GPU is a 7800GTX 512.
Written by optaviusx on 2006/02/10
Final PS3 dev kit at that.
Written by XAL on 2006/02/10
120-frames-per-second games at 1080p, my ass, Kutaragi.

"Unlike Xbox and PS2, where Xbox had a host of built-in effects that were a generation ahead of PS2, the Xbox 360 and PS3 are same-generation machines. One doesn't have additional effects over the other - 360 can do the same effects, just not as many of them simultaneously and with less geometry [because of the speed difference], but memory bottlenecks can kill part of the PS3 speed advantage anyway... the overall visual difference it makes will depend a lot on the developer's skill, and how much time and money the publisher spends on a game."

So basically it's not even going to be 1.3x better than the 360... let alone 2x better.

Man, Kutaragi is full of it.
Written by optaviusx on 2006/02/10
Well, it's to be expected; anyone who knew what was inside the PS3 knew it wasn't capable of what Sony said it was capable of. Both consoles will be great though.

I personally intend to own both. Keep in mind that everything they compared the PS3 games they saw to on the 360 has been a rushed launch title, not showcasing what the 360 is fully capable of yet.
Written by XAL on 2006/02/10
Right-o.

It just bugs me how these arrogant execs post blatant lies... more so Kutaragi than any other person in the game business... no one is that full of BS, except maybe Peter Molyneux (or however you spell his name)

=P
Written by gooseboy (34) on 2006/02/11
Blatant lies? How about your MS bumchums...

"All games will be minimum 720p, 30fps and 2xAA (or was that 4?)..."

Yet many games are under 30fps... PGR is only 600p and NO game uses AA right now.

Not to mention the old "Xbox 360 is water cooled" video and the "You do NOT need a hard drive to play any 360 games" line...

Even Stevens mates.
Written by MattTS (26) on 2006/02/11
I've got loads of games using AA for 360. Only PGR3 is 600p. They never claimed it was water cooled - they said liquid cooled and I think it still has heat pipes which contain a liquid (alcohol?) which carries the heat.
Written by Guerrilla (4) on 2006/02/11
In fact, tell me one game that does not use AA; I haven't seen one yet. OK, PGR3 doesn't make perfect use of it, but it still does have AA. And if you want to call that soccer-manager thing a game, then OK, you're right, they lied on that one. But honestly, a soccer manager doesn't count for me, and I bet there's not going to be a game that's going to need the HDD (except of course MMOs, but that was clear from the start...)
Written by silverwolf on 2006/02/11
Any TRUE gamer knows first-generation games suck. But in all honesty, Xbox 360 games were pretty impressive for its first round, Kameo being a stunning game IMO. Guys, what classifies second-gen games? Is it after a year, or just the games following launch?

So criticizing a tangible Xbox 360 with 1st-gen games isn't the way to go.
Written by The G (9) on 2006/02/11
'So criticizing a tangible Xbox 360 with 1st-gen games isn't the way to go.'

Yet criticizing the PS3 based on what someone else says, and not your own use of the machine, is the way to go?

I agree with most of you that Sony lied, but they are playing it smart. Most gamers have no idea about the difference between 720p and 1080p; they have already bought into the idea that the PS3 is 2x better than the 360. So all Sony has to do now is produce games that look slightly better than the 360's, feed them another lie that the best graphics on a console don't come out until later in its life, produce another awful GTA game, and then Sony wins.

I myself am not going to judge the PS3 until I see with my own two eyes what the machine does. Everyone, no matter what they say, is biased one way or another. And finally, the best console will be judged on games, not graphics.
Written by DaGamer (33) on 2006/02/11
I couldn't read the whole article (the site is loading really slowly and I can't load the 3rd page), but nothing in what I have read really surprised me. If you expect that PS3 launch titles will be much better than 360 games, then you are an idiot. Give the PS3 2-3 years and you will see the difference.

BTW, I never read that there is a bottleneck in the PS3 system; the 360 has a bottleneck with its EDRAM.
Written by optaviusx on 2006/02/11
DaGamer, are you kidding??? The EDRAM doesn't have a single bottleneck; the PS3's low memory bandwidth will bottleneck it. The 360's EDRAM has a 2GHz bus between it and the GPU; that's like 3 times faster than any high-end RAM.

The EDRAM has inside it a logic unit that can transfer data with the 10MB of RAM at 2 terabits per second... The memory bandwidth on that 10MB of EDRAM alone is 5 times the memory bandwidth of the ENTIRE PS3: 48GB/s for the PS3, 256GB/s from just the 10MB of EDRAM. Add the 22.4GB/s and the 360 has a total of 278.4GB/s.



The PS3's graphics card is a bottleneck to performance as well, as it is incapable of operating at its full capability due to its older dedicated vertex and pixel shader setup. With its unified shader architecture combined with the 10MB of EDRAM, the 360's GPU is running at 95-99% efficiency with 4x multisample antialiasing.

I've yet to see a 360 game without antialiasing.

The big deal here is that the PS3 isn't nearly as overpowered compared to the 360 as it was hyped to be, and PS3 devs telling them that both consoles can do the same things is 100% inaccurate. Keep in mind that everything those guys mentioned about what the 360 can do ASSUMES the 360's capability has been maxed out... I don't think I need to tell anyone here that we haven't seen more than the tip of the iceberg yet with the 360.

One thing I'd like everyone to remember: the 360 is capable of DirectX 10; the PS3's GPU isn't. Just as developers need time to learn about the PS3, devs also need time to learn about the 360. All three cores have yet to be fully utilized.

The reason the PS3's graphics setup is a bottleneck is that while a pixel shader operation is being performed, the vertex operation has to wait for the pixel operation to complete, and likewise if a vertex operation is performed first, the pixel op has to wait. There are only 24 pipelines in the PS3.

The 360 not only has 24 more pipes capable of handling pixel ops, it also has 40 more pipes capable of handling vertex operations. The 360's GPU has 48 pipelines that can accept BOTH pixel and vertex shader operations simultaneously. Just think about that for a second.
Written by silverwolf on 2006/02/11
"Yet criticizing the PS3 based on what someone else says and not your own use of the machine is the way to go?" No, it's not; you're right. But claiming two times better than the Xbox 360 isn't right either. This has been stated not only by Sony's head honcho but by gamers who took his word for it. Now you hear fans say "I'll wait until it comes out" or "first-gen games don't count". But when the Xbox 360 came out it was called the Xbox 1.5, and many who did not own an Xbox 360 said that the games looked like a hi-res Xbox.

A true gamer knows first gen is never as good. While some may say wait a few years and you'll see the power of the PS3, I say "NO SHIT". The same thing applies to the Xbox 360. Not only that, but developers will have more time with the Xbox 360 and may make full use of its power before the PS3 does.
Written by optaviusx on 2006/02/11
We mention lies, but at least they weren't outrageous claims that people didn't see coming. Everyone knew there'd be games that would require an HDD, and more than 90% of 360 owners have the hard drive anyway. With the exception of PGR3, all games are 720p and have AA. Other than that, Microsoft has delivered on each and every claim, as we are seeing.

Sony guaranteed a console that BLOWS the 360 away in every way possible. "720p is outdated, the TRUE high definition is 1080p", "we'll be able to do simultaneous 1080p outputs at 120fps" - that is flat-out lying to customers. They called the Xbox 360 the Xbox 1.5; if the two are a lot more similar than Sony let on, then is the PS3 the PS2.5?

haha silver took the words out of my mouth
Written by XAL on 2006/02/11
J Allard's and Peter Moore's "lies" are dwarfed by Kutaragi's lies several times over.

You can play 360 games without an HDD; the only thing that will affect non-HDD users is slightly longer load times (for SOME games). There are no graphical differences.

What are you talking about? There is anti-aliasing.

I thought PGR3 is 700p, not 600p.

NO 360 game drops below 30 FPS; PGR3 is ALWAYS above 40 FPS - I doubt you have even played a 360 for more than 20 minutes.

Whoever said the contrary of the above is a pathetic fanboy.

What's the bigger lie: all games are 720p when ONE LAUNCH GAME runs at 700p - or all games will run at 120fps at 1080p with like 6x AA, when in reality they're barely able to run at 720p and the framerate is just about 30fps? Use your damn brain, moron.
Written by The G (9) on 2006/02/11
I agree with you, optaviusx and silverwolf. These were blatant lies, downright untruthful, and Sony is once again completely misguiding consumers and people in the gaming industry.

But as I said before, gamers generally have no idea about the difference between 720p and 1080p; they don't understand it and don't care.

Sony have generated enough hype to make sure their console sells.

Also, the fact that not all games are 1080p may be an attempt to keep costs down and make sure they have developers on their side. As we all know, the PS3's cost of production may be something never seen in gaming since the Neo Geo days.

And besides, in the end consoles will be judged by their games rather than their graphics.

Written by optaviusx on 2006/02/11
By the way, I just looked at the back of the Project Gotham Racing 3 box and it clearly says HDTV 720p. So I'm not sure where this 600p thing comes from...
Written by optaviusx on 2006/02/11
Hell, Need for Speed: Most Wanted is a first-wave title for the 360, and check out the environments in that game... trees swaying in the wind, roads that look awesome. In general there are a lot of great things at work in that game that grab your attention.

A lot of people don't truly realize that Microsoft had an incredibly solid launch without many of their triple-A developers being involved. The next time we see something from Rare it's going to be major. If only Rare had had more time with final 360 dev kits to make PDZ even more impressive than it was; but nonetheless PDZ is still extremely fun to play. It's one example that a game isn't all about graphics but gameplay, and PDZ delivers in that area. The later missions are extremely fun and the multiplayer is pretty addicting.

I can guarantee that I-8 is going to be one of the PS3's most impressive games; it has the potential to do a lot better than CoD 2 did for the 360, on a much larger scale. Those developers are top of the line, and I'm almost positive I-8 was one of the three games that wowed Kikizo in their hands-on with the PS3.
Written by lord kuragari on 2006/02/11
You're all shills of some whore marketing company.
Written by DaGamer (33) on 2006/02/12
"The EDRAM doesn't have a single bottleneck; the PS3's low memory bandwidth will bottleneck it."

This would only be the case if they made the RSX without any vertex shaders (and let the Cell do all the vertex processing), which is very unlikely. BTW, you're probably also not aware of the advantages of using separated memory. Since the RSX is connected to the XDR DRAM and GDDR3 RAM similarly to a TurboCache GPU, it can access both memory pools at the exact same time. This gives the RSX an effective 48 GB/s when sending data to/from GPU and RAM.

"The 360's EDRAM has a 2GHz bus between it and the GPU; that's like 3 times faster than any high-end RAM."

Do you actually understand what you are saying? This just means, in other words, 32 GB/s of GPU-to-eDRAM bandwidth.

"The EDRAM has inside it a logic unit that can transfer data with the 10MB of RAM at 2 terabits per second... The memory bandwidth on that 10MB of EDRAM alone is 5 times the memory bandwidth of the ENTIRE PS3."

Same song again. When will you understand that the 256 GB/s is eDRAM-internal bandwidth, between the eDRAM's logic and the eDRAM's memory; it exists only within the eDRAM. The real bandwidth between the eDRAM and the GPU is 32 GB/s.

"48GB/s for the PS3, 256GB/s from just the 10MB of EDRAM. Add the 22.4GB/s and the 360 has a total of 278.4GB/s."

You have absolutely no idea how this works; you can't just add it all up. But if you do, then you must have forgotten that the PS3 has 25.6 GB/s GPU to XDR DRAM, 22.4 GB/s GPU to GDDR3 VRAM, 35 GB/s GPU to CPU, a 5 GB/s system bus, a 204.8 GB/s Cell EIB and a 76.8 GB/s Cell FlexIO bus. This makes a total of 369.6 GB/s. The 360, on the other hand, has 22.4 GB/s of memory interface bus bandwidth, 256 GB/s of eDRAM-internal bandwidth, 32 GB/s of GPU-to-eDRAM bandwidth, a 21.6 GB/s front side bus and 1 GB/s of southbridge bandwidth. This makes a total of 333 GB/s. But as I said earlier, it doesn't work that way.
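For what it's worth, the per-bus figures quoted in this thread do sum to the totals given. A minimal sketch checking only the arithmetic (summing independent buses is marketing math, not a real throughput number):

```python
# Per-bus bandwidth figures (GB/s) as quoted in the post above.
ps3_buses = [25.6, 22.4, 35.0, 5.0, 204.8, 76.8]  # XDR, GDDR3, GPU-CPU, system bus, Cell EIB, FlexIO
x360_buses = [22.4, 256.0, 32.0, 21.6, 1.0]       # main RAM, eDRAM internal, GPU-eDRAM, FSB, southbridge

print(round(sum(ps3_buses), 1))   # 369.6
print(round(sum(x360_buses), 1))  # 333.0
```

The sums match the 369.6 GB/s and 333 GB/s totals in the post, which is all this shows; none of these buses can actually be added into one usable bandwidth figure.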

"The PS3's graphics card is a bottleneck to performance as well, as it is incapable of operating at its full capability due to its older dedicated vertex and pixel shader setup."

Again, you don't have a clue what you are talking about; this has nothing to do with a bottleneck. There are 48 shader units in the Xbox 360 GPU, but the RSX's specialised shader pipelines are more powerful than the Xbox 360's pipes. Fact is, the shader processing power of the Xbox 360 GPU is roughly similar to that of a specialised 24-pipeline ATI R420 GPU. Add to this the fact that the RSX will most likely have 6 additional vertex pipes on top of the 24 pixel pipelines. Here is the comparison again:

Xenos:

48shaders * 10FLOPs per shader * 500Mhz = 240 GFLOPs

RSX:

24pixel shaders * 16FLOPs per shader * 550Mhz = 211.2 GFLOPs

8Vertex shaders * 10FLOPs per shader * 550Mhz = 44 GFLOPs

PS3 211.2 + 44 = 255.2 GFLOPs > 240 GFLOPs Xbox360.
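The peak-GFLOPs estimates above multiply out as quoted; a quick sketch, noting that the per-shader FLOP counts (10 for Xenos, 16/10 for the RSX pixel/vertex pipes) are the poster's assumptions, not confirmed specs:

```python
# Peak shader GFLOPs = shaders * FLOPs per shader per clock * clock (GHz).
xenos_gflops = 48 * 10 * 0.500       # 48 unified shaders @ 500 MHz
rsx_pixel_gflops = 24 * 16 * 0.550   # 24 pixel shaders @ 550 MHz
rsx_vertex_gflops = 8 * 10 * 0.550   # 8 vertex shaders @ 550 MHz

print(xenos_gflops)                                    # 240.0
print(round(rsx_pixel_gflops + rsx_vertex_gflops, 1))  # 255.2
```

These are theoretical peaks under the assumed pipe counts; sustained throughput depends entirely on how well each architecture keeps its pipes busy.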

"While the 360 is using a unified shader architecture combined with the 10MB of EDRAM, the 360's GPU is running at 95-99% efficiency with 4x multisample antialiasing."

False. A short quote from AnandTech: "Although we were originally told back at E3 that all Xbox 360 titles would support 4X AA, it seems that the statement has since been revised to 2X or 4X AA." Also, with the shift to a unified shader architecture, it is even more imperative to make sure that everything is running at maximum efficiency.

"The big deal here is that the PS3 isn't nearly as overpowered compared to the 360 as it was hyped to be..."

What do you mean by overhyped? Nobody expected that the PS3 would look much better from the start. But already NOW, "something that PS3 does better than Xbox 360, most straightforwardly, is more stuff simultaneously. An impressive list of simultaneous, wonderfully shaded, dynamic visual effects was evident, but PS3 was also able to throw around tonnes of geometry in terms of realtime 'explosion' calculation - and convincingly affect dozens of objects all at the same time."

"Keep in mind that everything those guys mentioned about what the 360 can do ASSUMES the 360's capability has been maxed out... I don't think I need to tell anyone here that we haven't seen more than the tip of the iceberg yet with the 360."

Well, they are probably nearer that tip of the iceberg than Sony. "Realistically, as libraries and experience with both machines grow, I think the PS3 will start showing things the 360 will choke at."

"One thing I'd like everyone to remember: the 360 is capable of DirectX 10; the PS3's GPU isn't."

Sure, but why? Because the RSX works with OpenGL ES 2.0 and not with DirectX. OpenGL is not updated as fast as DX because it's an open environment; developers can create custom shaders and are far more open-handed when working with it. In fact, it's more software-based than DX and is used for professional work, not just games. DirectX is a closed and limited environment created by Microsoft just for games and needs to be updated regularly. BTW, the 360 supports DirectX 9.0 with Shader Model 3.0 and LIMITED support for future DX10 shader models, so the truth is that the 360 doesn't fully support DX10. For example, with DX10 comes Shader Model 4.0, and the 360 doesn't support it.

"Just as developers need time to learn about the PS3, devs also need time to learn about the 360. All three cores have yet to be fully utilized."

But we all know that the 360 is easier to develop for, so it will take a little longer until PS3 devs can take full advantage of the PS3's power.

"The reason the PS3's graphics setup is a bottleneck is that while a pixel shader operation is being performed, the vertex operation has to wait for the pixel operation to complete, and likewise if a vertex operation is performed first, the pixel op has to wait."

Once again, this isn't a bottleneck. You forget that unified shaders are not as powerful as specialised shaders; they need more time to do the same operation. Also, a pixel shader needs far, far more texture-math performance and read bandwidth than an optimized vertex shader, so if you used that pixel shader to do vertex shading, most of the hardware would sit idle most of the time. Which is better - a lean, mean, optimized vertex shader plus a lean, mean, optimized pixel shader, or two less-efficient hybrid shaders? Let's make another little comparison of what the shaders are capable of:

RSX Multi-way programmable parallel floating point shader pipelines:

136 shader operations per clock

74.8 billion shader operations per second

33 billion dot products per second (51 billion dot products with CPU)

Xbox360 parallel floating-point dynamically-scheduled shader pipelines:

96 Shader operations per cycle across the entire shader array

48 billion shader operations per second

Dot product operations: 24 billion per second or 33.6 billion per second theoretical maximum when summed with CPU operations.
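The ops-per-second lines above are consistent with the ops-per-clock figures, assuming the commonly cited clock rates (550 MHz for the RSX, 500 MHz for Xenos):

```python
# Shader operations per second = ops per clock * clock rate.
rsx_ops_per_sec = 136 * 550e6    # RSX: 136 shader ops per clock @ 550 MHz
xenos_ops_per_sec = 96 * 500e6   # Xenos: 96 shader ops per cycle @ 500 MHz

print(rsx_ops_per_sec / 1e9)     # 74.8 (billion per second)
print(xenos_ops_per_sec / 1e9)   # 48.0 (billion per second)
```

Like the GFLOPs numbers, these are per-clock peaks scaled by frequency; they say nothing about how often each pipe actually issues an operation.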

The only visible benefit of unified shaders is that they let you pack more functionality onto fewer transistors, as less hardware needs to be duplicated for use in different parts of the chip, and both vertex and pixel programs run on the same hardware; that's it.

"There are only 24 pipelines in the PS3."

False; most sources are stating that the RSX will most likely have 6 more vertex pipes in addition to the 24 pixel pipelines. I'm really asking myself where you get your information from.

"The 360 not only has 24 more pipes capable of handling pixel ops, it also has 40 more pipes capable of handling vertex operations."

It doesn't have 24 more pixel pipes and 40 more vertex-capable pipes. It's 48 unified shaders vs. 30 specialised shaders.

"The 360's GPU has 48 pipelines that can accept BOTH pixel and vertex shader operations simultaneously. Just think about that for a second."

Already answered above. The PS3's shaders have greater FLOP performance, more shader operations per second and more dot products per second. Just think about that for a second.
Written by DaGamer (33) on 2006/02/12
"A true gamer knows first gen is never as good. While some may say wait a few years and you'll see the power of the PS3, I say 'NO SHIT'. The same thing applies to the Xbox 360. Not only that, but developers will have more time with the Xbox 360 and may make full use of its power before the PS3 does."

Sure, but MS will surely meet its limits sooner than Sony. I expect PS3 launch games (of course not all of them) to be of the same quality as 360 games at that time.

"Other than that, Microsoft has delivered on each and every claim, as we are seeing."

Where is HD-DVD?
Written by DaGamer (33) on 2006/02/12
"Sony guaranteed a console that BLOWS the 360 away in every way possible. 720p is outdated, the TRUE hi-definition is 1080p. 'We'll be able to do simultaneous 1080p outputs at 120fps' is flat-out lying to customers."

And it does; it all depends on the devs now. And they never said 720p is outdated, just that it's not true HD.

In an unbiased article from June 24th, 2005:



"In the end, Sony’s support for 1080p is realistic, but not for all games. For the first half of the console’s life, whether or not game developers enable AA will matter more than whether 1080p is supported. By the second half, it’s going to be tough to say.

Microsoft’s free 4X AA is wonderful and desperately needed, especially on larger TVs, but the lack of 1080p support is disappointing. It is a nice feature to have, even if only a handful of games can take advantage of it, simply because 1080p HDTV owners will always appreciate anything that can take full advantage of their displays."

And I doubt that they ever said they would be able to do simultaneous 1080p outputs at 120fps. Dual-screen output works only at 480p, and I've known that since the PS3 specs were released.

"By the way, I just looked at the back of the Project Gotham Racing 3 box and it clearly says HDTV 720p. So I'm not sure where this 600p thing comes from."

Just another MS lie. From an unbiased source, Nov 16, 2005:

"There have been discussions involving at least one Xbox 360 game (Project Gotham Racing 3), rendering internally at a lower resolution and having the Xbox 360's TV encoder upscale it to 720p."
Written by optaviusx on 2006/02/12
DaGamer, do you even know what you are talking about? You do realize that ATI's VP of engineering himself stated that the 360 runs at 95-99% efficiency with 4xAA, right?

The only reason he said 2xAA at the Zero Hour event is because he wasn't 100% sure about the 360's GPU.

The rest I won't even bother touching, because you have no understanding of the 360's graphics technology. What you just said, stupidly, is that a dedicated vertex and pixel setup can outperform a unified shader architecture???

Written by DaGamer (33) on 2006/02/13
Just because ATI's VP said that, does it really mean it's the truth? Please. You should refer to objective sources, not to what some ATI or MS guy is saying.

You put way too much value in this unified shader hocus-pocus. It has its advantages, like greater efficiency, because it packs more functionality onto fewer transistors and runs vertex and pixel programs on the same hardware. But does that mean it's more powerful? It may use the hardware more efficiently, but at the same time it gives that advantage away, because unified shaders are slower than specialised ones, and the RSX will also have a higher clock rate than Xenos.

Just read the articles; in none of them will you find that Xenos is more powerful than the RSX. Even the not-so-PS3-friendly GameSpot says that the shader processing power of the Xbox 360 GPU is roughly similar to that of a specialised 24-pipeline GPU.

The newest ATI card without unified shaders also beats Xenos:

X1900: 48 pixel pipes + 8 vertex pipes = 56 shader pipes @ 650 MHz

Xenos: 48 unified shader pipes @ 500 MHz

Xenos doesn't even outperform the GF7800GTX, and the RSX is more powerful: more pipes and a higher clock rate than the 7800GTX.
Written by DaGamer (33) on 2006/02/13
Nvidia was also aware that unified shaders are surely the future; they have benefits, like the ability to dynamically allocate chip resources depending on the demand for pixel and vertex processing, and they simplify software development. But you simply can't expect too much from the first gen of unified shaders. This technology surely needs its time before they find ways to make unified shaders as fast as specialised ones.

MS is also forcing this technology, because unified pixel and vertex data processing is a required capability for Windows Graphics Foundation 2.0, which comes out together with Microsoft's next-generation operating system, Windows Longhorn. So as you see, Nvidia didn't have a choice and had to base the G80 on a unified shader architecture.

In the interview, that ATI guy actually admitted that there are some penalties. If there were no issues with 4xAA, he wouldn't have said both when asked 2xAA or 4xAA. This is also what I was mainly referring to: "The logic and embedded DRAM on the daughter die is what allows the Xbox 360 GPU to essentially offer 'free' anti-aliasing, which Microsoft enforces through requiring developers to support a minimum of 2X AA in all Xbox 360 titles. Although we were originally told back at E3 that all Xbox 360 titles would support 4X AA, it seems that the statement has since been revised to 2X or 4X AA."
Written by silverwolf on 2006/02/13
"360 can do the same effects, just not as many of them simultaneously and with less geometry [because of the speed difference], but memory bottlenecks can kill part of the PS3 speed advantage anyway." Sums it up pretty well: a small advantage made useless by a memory bottleneck. Another thing that might kill the PS3's advantage is software AA and the hit it brings. I'm guessing they will have to choose between adding more effects on screen and utilizing AA, so Sony's legendary jaggies might be showing up in most games.
Written by optaviusx on 2006/02/13
Since when did the first generation of new video card technology fail to outperform the previous gen?

Microsoft has set the rules for DirectX for years, and haven't the rules Microsoft set always been met by ATI and Nvidia, ultimately making their video cards better than the previous generation? DaGamer, a MINIMUM of 2x doesn't mean they are forced to use only that. I see where the problem is: you currently don't understand how the 10MB of EDRAM works on the 360 and why 4xAA + HDR is achieved at no performance loss. I'll give you a link explaining what didn't happen in the 360's earlier titles. There is a tiling method developers must use so that 4xAA, among other effects, is achieved at practically zero performance loss.
Written by optaviusx on 2006/02/13
h**p://www.beyond3d.com/articles/xenos/index.php?p=05#tiled

Pay attention to the first chart: notice that at 720p or above the memory requirement is MORE than what the 10MB of EDRAM can handle? That right there is what took place in most 360 launch titles.

Now go all the way to the bottom of the page and you'll see what is going to be happening in all 360 titles from now on. Notice that even at a resolution of 1920x540 the 360 is using less than 5MB of the 10MB of EDRAM. That is the reason Microsoft revised its statement to a minimum of 2xAA: they didn't want to force developers to do something that required 18.1MB more than what the 10MB of RAM could provide.

They instead chose to have developers use at least 2xAA at 720p, which again went over the 10MB the EDRAM could provide, but only by 4MB... Such things hindered the 360's GPU in many launch titles, as the 22.4GB/s of GDDR memory bandwidth on the 360's GPU then had to be used up for the things the EDRAM is supposed to take care of.

This wasted a lot of resources in early 360 launch titles; the console's video and system memory were no doubt also affected by this sloppy use of the GPU's EDRAM.

Written by optaviusx on 2006/02/13
If developers don't use the tiling method, it requires more of the EDRAM's memory to do anything, and at 720p even 2xAA goes over the 10MB EDRAM limit.

            No AA    2xAA     4xAA
1280x720    7.0MB   14.0MB   28.1MB
1920x540    7.9MB   15.8MB   31.6MB

Notice that even with no AA it was close to fully using all of the EDRAM's 10MB. NOW look how much less of the EDRAM is used up with the proper tiling method, which the developers of Fight Night Round 3, DOA4, Ghost Recon, Tomb Raider, Too Human, Blue Dragon, Mass Effect, Lost Odyssey, Halo 3 - practically all 360 devs from now on - will be using. The EDRAM by itself takes care of a lot of things that all other GPUs out there would have to do themselves, relieving the 360's GPU of these tasks.

            No AA   2xAA   4xAA
1280x720     1MB     2MB    3MB
1920x540     1MB     2MB    4MB

No 360 launch title had that luxury, but all upcoming 360 titles will make use of it.
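The untiled numbers in the first table line up with a simple framebuffer-size calculation. A minimal sketch, assuming 4 bytes of colour plus 4 bytes of depth/stencil per sample (8 bytes per pixel per AA sample), with results truncated to one decimal as in the table:

```python
# Framebuffer memory needed without tiling, in MB:
# width * height * 8 bytes per pixel per sample * number of AA samples.
def framebuffer_mb(width, height, aa_samples):
    total_bytes = width * height * 8 * aa_samples
    mb = total_bytes / (1024 * 1024)
    return int(mb * 10) / 10  # truncate to one decimal place

for w, h in [(1280, 720), (1920, 540)]:
    print(f"{w}x{h}:", [framebuffer_mb(w, h, s) for s in (1, 2, 4)])
# 1280x720: [7.0, 14.0, 28.1]
# 1920x540: [7.9, 15.8, 31.6]
```

Every 4xAA figure exceeds the 10MB of EDRAM, which is exactly the problem tiling solves: each tile's working set fits inside the EDRAM.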
Written by DaGamer (33) on 2006/02/13
There are no memory bottlenecks; people are just assuming that the Cell will maybe do all the vertex processing, which surely wouldn't be too smart because of the 22.4GB/s local memory bandwidth. Or you are referring to the SPEs being in-order with no cache: threads on these processors have access to only 256KB of local memory, which is reasonable for a cache but not much in the way of memory. So the SPEs will depend on having low-latency access to memory in order to keep their pipelines filled and actually contribute any useful performance to the system.

Of course the new gen of cards will outperform the last, but mainly because of higher clock rates, bandwidth, DX10 or Shader Model 4.0. With previous gens they also raised performance with more pipes or more performance per pipe. With unified shaders this will be difficult, as for now they simply can't make the pipes as fast as specialised ones.

I know about this tiling method; I just thought it causes some problems, given they reduced the minimum to 2xAA.
Written by optaviusx on 2006/02/13
Microsoft was just covering themselves, simply because they knew they delivered final dev kits to developers much too late - in August 2005, with launch in November 2005. So they avoided saying 720p 4xAA when they thought that probably wouldn't be the case, but all 360 titles will be using it from now on.

As far as developers are concerned, the minimum now is 4xAA. The advantage of a next-gen graphics card is never just about clock rates; look at Nvidia's GeForce FX 5800 and 5900 lineups, which dominated the ATI 9700 and 9800 in clock speed but still got thoroughly outperformed. Thanks to unified shaders they don't need to be as fast; they are more versatile and get a lot more work done than is possible on previous video cards.

The bottleneck that can occur is not enough memory bandwidth for the PS3's GPU to handle tasks such as 4xAA + HDR + soft shadows + self-shadowing etc. without SERIOUSLY impacting the GPU. The PS3's GPU has about 48GB/s of memory bandwidth, but the 10MB of EDRAM on the 360's GPU has 256GB/s of memory bandwidth by itself; that right there is more than 5 times what the PS3's GPU has. ATI did a damn good job eliminating bottlenecks.

Another bottleneck is the way a specialized pipe works: it can only accept one type of instruction per pipe, which is what prevents any GPU with a specialized pipe setup from staying as close to its peak performance as possible, because of the stalling of pipes that has to happen in a specialized setup.



HDR + AA simultaneously in 360 games is nothing for the 10 MB of eDRAM, whereas no Nvidia GPU to date can perform both those tasks simultaneously, not even the 7800GTX 512, which is what the PS3 GPU is.

You are right that with unified shaders it WILL be difficult to get more performance per pipe, but ATI and Microsoft pulled it off: 16 gigasamples per second of fillrate, where the 7800GTX 512 does 13.2 gigasamples, and the X1900XTX, which outperforms the 7800GTX 512, only does 10.4 gigasamples.

The RSX does 74.8 billion shader operations per second (100 billion with Cell), or 136 shader operations per clock, exactly what the 7800GTX does.

The 360 GPU, on the other hand, does 96 billion shader operations per second, and obviously does more per clock if it does more per second.
Written by DaGamer (33) on 2006/02/13
"The advantage of a next gen graphics card is never because of the clock rates"

I never said it is because of the clock rates. Clock rates, bandwidth, pipe count, and performance per pipe are the main factors in performance.

"I mean look at nvidia's geforce fx 5800 and 5900 product lineups which dominated the ati 9700 and 9800 in clock speeds, but still got thoroughly outperformed."

Because they used far fewer pipes.

"The ps3 gpu has like 48GB/s of memory bandwidth, but the 10MB of EDRAM on the 360's gpu has 256GB/s of memory bandwidth by itself that right there is more than 5 times what the ps3 gpu has for memory bandwidth."

"HDR + AA simultaneously in 360 games is like nothing for the 10MB of edram"

I already told you it doesn't work that way. This has nothing to do with HDR: the 10 MB of eDRAM can only use the 256 GB/s internally, and only for the tasks it's specialised for, like z and stencil operations, colour and alpha processing, and anti-aliasing. For traffic between the eDRAM and the GPU, the bandwidth is only 32 GB/s; the eDRAM can't even assist with simple texturing, let alone HDR.



"whereas no nvidia gpu to date can perform both those tasks simultaneously not even the 7800GTX 512"

Actually, even the 7800GTX 256 runs HDR + AA simultaneously IF it's HDR without a floating-point frame buffer. Valve used this brand-new HDR technique in their single-level demo, Lost Coast. Half-Life 2 used to be ATI's game, often performing better on ATI hardware when paired against similarly priced Nvidia cards. With the new Lost Coast level, the tables are turned: even a standard 7800 GTX runs through the test faster than the Radeon X1800 XT, and XFX's new 512 version beats ATI's best card of the time by as much as 40%. You can be sure that Sony will use this type of HDR for their games.

"which is what the ps3 gpu is."

False; at the moment there is no card comparable to the RSX available. Nvidia stated that by the time the PS3 launches there will probably be comparable cards available, which means at least the 7900GTX 512, Nvidia's answer to the X1900.

"You are right with a unified shader it WILL be difficult to get more performance per pipe, but ati and microsoft pulled it off."

They didn't pull it off. Their unified shaders are of course more efficient, but they are slower than specialised shaders. BTW, they can only get significantly better performance per pipe with a new and better pipe architecture, not just with software.

"The rsx does 74.8 billion (100 billion with cell) shader operations per second 136 shader operations per clock exactly what the 7800GTX does.

The 360 gpu on the other hand does 96 billion shader operations per second and obviously does more per clock if it does more per second."

Look it up: the 360 GPU does 48 billion shader operations per second and 96 shader operations per cycle. You can't deny the facts. The PS3 beats the 360 in FLOP performance, shader operations per second and per cycle (although this shouldn't make too big a difference, since the 360 uses its shaders more efficiently), dot products per second, bandwidth, and memory system (because of the separated memory, the PS3 can access both memory pools at the exact same time, which almost doubles the memory bandwidth).

There is no doubt the future lies in unified shaders, but this gen's architecture can't take full advantage of the technology.
Written by optaviusx on 2006/02/13
DaGamer, you are 100% wrong: the 360 GPU does 96 billion shader operations per second, and I'll prove it.

In response to my comment "I mean look at nvidia's geforce fx 5800 and 5900 product lineups which dominated the ati 9700 and 9800 in clock speeds, but still got thoroughly outperformed," you said it was because they used fewer pipes. The 360 has twice the number of pipes the RSX has: the RSX has 24, the 360 has 48. You are only proving my point.



The 360 only capable of 48 billion shader ops per second? You are misinformed, my friend.

"First off, we reported on page 2 in our chart that the capable “Shader Performance” of the Xbox 360 GPU is 48 billion shader operations per second. While that is what Microsoft told us, Mr. Feldstein of ATI let us know that the Xbox 360 GPU is capable of doing two of those shaders per cycle. So yes, if programmed for correctly, the Xbox 360 GPU is capable of 96 billion shader operations per second. Compare this with ATI’s current PC add-in flagship card and the Xbox 360 more than doubles its abilities"

said by ATI's VP of engineering; I'll link you to it in the forums since I can't post a link here.



You keep ignoring facts and accusing ATI of not fully taking advantage of the unified shader architecture. In case you weren't aware, MICROSOFT is the company that sets the rules for DirectX and the best ways to take advantage of it.

The Xbox 360 just happens to be Microsoft's console. You think you have a better understanding of this technology than ATI's and Microsoft's engineers, when both collaborated to make the GPU in the 360?
Written by DaGamer (33) on 2006/02/14
*Edited: No links*
Written by DaGamer (33) on 2006/02/14
And I don't accuse ATI of failing to take full advantage of the unified shader architecture. You don't have a clue about hardware development: do you really think the first gen of specialised pipes worked as fast as they do now? Unified shaders are a brand-new technology, and it will take a few gens of PC graphics cards until they work as fast as this gen's specialised shaders.
Written by optaviusx on 2006/02/14
DaGamer, I'm sorry to say you have absolutely no idea what you are talking about. You think ATI or Nvidia are a bunch of amateurs? So you are sitting there boldly saying that the 7800GTX will outperform Nvidia's G80, which will have a unified shader architecture, SIMPLY because it's brand-new technology?

So let me guess, DaGamer: we are supposed to take everything said by Sony and Nvidia as fact, but anything said by ATI or Microsoft as false?

The info on that site is incorrect, in case you didn't know, my friend ;) Microsoft accidentally said the 360 GPU does 48 billion shader operations per second because they didn't know as much about the chip itself as ATI did. ATI made clear it was 96 billion shader operations per second, no less.
Written by optaviusx on 2006/02/14
*Edited: You posted links, not allowed. Post removed*
Written by cchris99 (4) on 2006/02/14
Ha ha ha, it's very interesting to see that Xbox fans always believe anyone that has nothing to do with Sony, then later, when Sony dispels all these rumours, you guys always seem so shocked. I mean, they don't even have any proof, and I seem to recall this same website said that most devs don't have final hardware, and later Sony said they had released over 4,000 dev kits.

Next time, only listen to Sony, not those informants that are paid by Microsoft to make up lies.
Written by DaGamer (33) on 2006/02/15
I'm not saying the 7800GTX will outperform Nvidia's G80; I'm just saying the performance per pipe will be worse. Why do you think the 360 GPU has 48 shaders? Because otherwise it would be outperformed by today's 24 specialised shaders, big time. As I remember, the 7800GTX came out just before the 360 did. The G80 will come a year later, and it will outperform the 7800GTX because of more shaders and a better shader architecture than the 360 GPU has.

I don't take everything Sony says for granted, only some reliable sources. Sony didn't say the 360 GPU has the performance of a 24-specialised-shader card; many sites that have nothing to do with Sony, and that are also critical of Sony, did.

This info isn't incorrect; do the math:

2 shader operations per pipe per cycle × 48 pipes = 96 shader operations per cycle across the entire shader array

Shader performance: 48 billion (48,000 million) shader operations per second (96 shader operations × 500 MHz)
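Both sides' per-second figures follow from the same throughput formula, ops per cycle times clock frequency. A minimal sketch, using the clock speeds published at the time (500 MHz for the 360's Xenos, 550 MHz for the RSX; the per-cycle counts are the figures the commenters quote):

```python
# Sanity-check of the shader-op figures quoted in this thread.

def shader_ops_per_second(ops_per_cycle: int, clock_hz: float) -> float:
    """Shader throughput = operations per cycle * clock frequency."""
    return ops_per_cycle * clock_hz

# Xbox 360 GPU (Xenos): 48 pipes * 2 ops per pipe per cycle = 96 ops/cycle
xenos = shader_ops_per_second(48 * 2, 500e6)
print(xenos / 1e9)  # -> 48.0 (billion per second, matching the quoted figure)

# RSX: 136 ops per clock at 550 MHz
rsx = shader_ops_per_second(136, 550e6)
print(rsx / 1e9)  # -> 74.8 (billion per second)
```

This reproduces both numbers in the thread, so the disagreement is really over whether to count 48 or 96 billion per second for Xenos, i.e. per-cycle versus per-second figures, not over the arithmetic.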
Written by cchris99 (4) on 2006/02/15
But remember, Sony's competition isn't the Xbox but HD-DVD, and the RSX can't be compared to the 7800GTX because it works together with the Cell processor, so it's a totally different GPU; that is why it's able to do CGI graphics in real time, which is what Sony and Nvidia have been saying since E3, but nobody wants to believe them.