AMD Next Generation ATI Radeon Eyefinity Technology

looks "sporty"

I thought the GTX 295 was faster in Crysis than the 4890.

At long last the Crysis card is coming
 
looks "sporty"

I thought the GTx295 was faster in crysis than the 4890

At long last the Crysis card is coming

There is no way 4890 CrossFire is like 5 fps faster than the GTX 295 - maybe 1-2 fps max! Even the GTX 295 is generally faster in Crysis than 2x 4890.
 
OK, I'll take your word on it.

Then again, I think it's more of a personal preference.

I'd rather game on a 47-inch 1080p TV than a 30" Dell monitor any day of the week, so a higher resolution doesn't really cut it for me.
 
Off topic, take it to another thread if you want to argue about 3 frames per second. :rolleyes:
 
I've got an old 640x480 projector you can shine on the wall in your living room. It'd give you an even bigger picture than 47". Should be right up your alley, since resolution doesn't matter to you. Would be really fast, too. JK!

Seriously, I get what you are saying. At some point there are diminishing returns for everyone. I have a 50" plasma and a 30" Dell, and personally I will use the Dell for gaming every day over the plasma... but it is a great HDTV.
 
This is pretty compelling stuff, but I don't know if I 'personally' would enjoy it in its current form.

Unless these Samsung ultra-thin bezel monitors are really, really ultra thin (i.e., there is barely any noticeable gap between the screens), I can't see myself using this setup for heavy MMO gaming. It'd be mighty annoying having my character split between two screens. Moreover, I don't see gaps in your viewpoint being conducive to moving out of fire, etc. I realize this is speculation, since nothing is out yet for me to try for myself.

But I really think this technology hinges on the panel suppliers.

That said, I think this tech is needed to move us closer to truly immersive gaming.
 
What would be perfect for me would be a 1600x900 monitor with the same pixel pitch as the big 30" Dells.

Then I could run a pair of the smaller monitors in portrait mode, flanking the big 30" screen in landscape, for an overall resolution of 4360x1600.

That would rock my world!
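For what it's worth, that span math checks out. A quick throwaway sketch in Python - the panel list is just this example's assumption:

Code:
# Combined desktop span for mixed landscape/portrait monitors.
# Each entry is (width, height) in its mounted orientation.
monitors = [
    (900, 1600),   # 1600x900 panel rotated to portrait
    (2560, 1600),  # 30" Dell in landscape
    (900, 1600),   # second portrait panel
]

total_width = sum(w for w, _ in monitors)
common_height = min(h for _, h in monitors)  # usable height = shortest panel
print(f"{total_width}x{common_height}")      # -> 4360x1600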
 
No matter how thin the bezel is, there will still be a grid effect.
 
I would suggest to anyone who doesn't "get" the idea of multi-monitor gaming to WAIT AND SEE FOR YOURSELF before making any judgments.

Gaming, especially FPS and driving games, on multiple monitors is wholly different than having your web browser spread across the screens with the bezel breaking up the window.

With a 3x1, you have your main monitor in front of you with all of the primary action happening there where you are focused. The other two monitors flank that, giving your peripheral vision SOMETHING IT HAS NEVER HAD BEFORE during gaming - authentic stimulation, in motion and at high resolution.

The image is not static; it doesn't just sit there - everything is in motion. The bezels will mean next to nothing.

Those making a big deal out of the bezels will discover that whatever negative impact on the experience the bezels have will be minimal at worst. Don't underestimate what your brain needs - more stimulation, more "environmental" information - and what your brain can do, which is completely ignore the bezels.
 
I wonder what the benchmarks are going to look like.

Any word on an Nvidia conference, or when they're going to show off their next gen?
 
Undoubtedly the long-time computer hardware enthusiast is going to harken back to the days of Matrox and empty promises of multi-monitor gaming.
Read my mind.

So how many game companies has AMD teamed up with that are going to support those kinds of resolutions?
 
I'd imagine it is less of a problem these days due to the explosion of monitor resolutions in the last couple of years.

Whereas in the past a developer would target:
800x600
1024x768
1280x960
1280x1024
1600x1200

Now they have to contend with the additional:
1024x576
1024x600
1280x768
1280x800
1366x768
1440x900
1440x1080
1600x900
1680x1050
1920x1080
1920x1200
1920x1440
2048x1440 (?)
2560x1600

So it is much easier, in a world of EDID and SVG, to just pull the screen resolution from the video card and use that, scaling HUD elements to fit. So any resolution should work, except with backwards developers who don't use this method.
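As a rough illustration of that approach - a minimal sketch, not any particular engine's API, and the reference height is an assumed value:

Code:
# Resolution-independent HUD placement: query the current mode once,
# then position and scale elements from a reference size.
REF_HEIGHT = 768  # assumed height the HUD art was authored for

def hud_layout(screen_w: int, screen_h: int):
    scale = screen_h / REF_HEIGHT  # scale by height so any aspect ratio works
    minimap = int(128 * scale)
    margin = int(16 * scale)
    # anchor to the corner with scaled offsets, not absolute pixels
    pos = (screen_w - minimap - margin, margin)
    return scale, pos, minimap

# Behaves the same at 1024x768, 1920x1200, or an Eyefinity 5760x1200 span:
for w, h in [(1024, 768), (1920, 1200), (5760, 1200)]:
    print((w, h), hud_layout(w, h))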
 
An awful lot of them are backwards developers, then. EA took forever to do widescreen, and many others didn't get their FOV right.
 
Kyle, I noticed from pictures shot at the event that Samsung was represented there as well with their ultra-thin bezel displays.

The only problem I see is that all the Eyefinity stuff was done on Dell monitors (take a look at the article).

Did you do any Samsung coverage/are you under NDA/can you show us that stuff?
 
From what I've been reading, quite a few devs are willing to sign up for this, since all they have to do is add more resolutions to their config files to support it, where possible.

Not to mention, it makes for a great setup to demo the games.

I fully expect this to have a larger impact than most people think, because a lot of game types will take full advantage of the extra real estate...
 
Read my post above. If the game grabs resolution data from the OS, the game already supports the resolution(s) you can create with Eyefinity.
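For example, on Windows the whole Eyefinity group is presented as one big desktop, so even a trivial query of the virtual-screen size already reports the full span (a sketch using ctypes; Windows-only):

Code:
import ctypes

# Win32 virtual-screen metrics: bounding box of all attached displays.
SM_CXVIRTUALSCREEN = 78
SM_CYVIRTUALSCREEN = 79

user32 = ctypes.windll.user32
w = user32.GetSystemMetrics(SM_CXVIRTUALSCREEN)
h = user32.GetSystemMetrics(SM_CYVIRTUALSCREEN)
print(f"desktop span: {w}x{h}")  # e.g. 5760x1200 on a 3x1 Eyefinity group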

I'm just interested in these ultra-thin / no-bezel monitors Samsung had at the event, but it seems they weren't covered? There is a sign for them there. It's here!

Show us the monitors. *folds arms*
 
Went to the party, had a great time. :D

Made a short video of the DIRT 2 setup they had playable there. Quality is fairly crappy since I used my iPhone and it really compresses the video on upload, but oh well.

http://www.youtube.com/watch?v=teE5wqT2DNU

It looked amazing in person, almost makes me want to buy a few extra monitors. :p

Thanks for the vid! Really didn't notice the grid at all; do you know what brand monitors they were?
 
The Samsung thin bezels can be seen in this shot: 7mm on the sides, 8.5mm on the bottom, and 9mm on the top. Approximate guesses.

http://www.hardocp.com/image.html?image=MTI1MjU0MjA2MXZGMTQxM01nTDlfMV8xMl9sLmpwZw==
 
They don't need to support the resolution.
It's all done in the Catalyst Control Center and the OS. It's seamless. Check out the Anandtech coverage of the event: http://www.anandtech.com/video/showdoc.aspx?i=3635

I'll check that out. It also reminds me that Galactic Civilizations 2 was made so it could scale to any future resolution, so I'll have at least one game this will be good for.
 
Do the screens have to have identical resolutions for this to work, or can you mix monitor sizes, accepting some loss of pixel data (e.g., 2x portrait + a center monitor)?
 
With a 3x1, you have your main monitor in front of you with all of the primary action happening there where you are focused. The other two monitors flank that, giving your peripheral vision SOMETHING IT HAS NEVER HAD BEFORE during gaming - authentic stimulation, in motion and at high resolution.

Werd! The same concept applies to daily driving - you don't notice your A-pillars at all when you're looking forward through the windshield, but your eyes catch cars to your sides in your peripheral vision no problem. And you don't see people complaining about the A-pillar being in their field of view (for daily driving, of course)!

A 3x1 setup looks like it would be killer - and if you're focusing forward, with the right angle setup, the bezels would probably disappear from your field of view anyway.
 
1. Kyle, thanks for passing along the invite for the show yesterday - I got to go and it was *impressive*. My crystal ball is showing 3 monitors in my future. I'm sure the crystal ball is working, since I just got it back from the shop after I damaged it earlier in the year with faulty stock picks.

2. I haven't read all of the posts, but I haven't seen mention of the laptops that were shown. They had a number of laptops there driving large-screen monitors, which I thought was really impressive. And specifically, what I mean by impressive is not just the gaming but also some of the apps running multiple video streams.

I think this could be a home run for AMD/ATI. I've been holding off building a new primary PC/server, awaiting an Intel Gulftown processor with 12 GB RAM. Now I have a good idea what video card will be dropped in. :)

One thing I thought funny was the AMD exec doing a presentation around 6:30 pm talking about making technology that just worked - and he had a number of audio/visual issues (no sound, no video clips). I certainly sympathized with him; things invariably go wrong in presentations (been there / done that). But seeing as how he was focusing on "making technology just work" for the user, it was just too funny.
 
They don't need to support the resolution.
It's all done in the Catalyst Control Center and the OS. It's seamless.
Yeah, just like using the native res on any single monitor is seamless. Which, of course, it's not. Take a look around the WSGF for an idea of how seamless an experience you can expect without developer support.

And considering how long widescreen support took to get to where it is today, despite the fact that you don't really have a choice when buying a monitor these days, it'll be a long while before you can expect universal support for a niche technology like this.
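For reference, the fix the WSGF pushes is Hor+ scaling: hold the vertical FOV constant and widen the horizontal FOV with the aspect ratio instead of cropping. A minimal sketch of that math (the 90-degree 4:3 baseline is just an assumed example value):

Code:
import math

def horplus_hfov(base_hfov_deg, aspect, base_aspect=4 / 3):
    # Hor+ scaling: vertical FOV stays fixed, horizontal FOV grows with aspect.
    base = math.radians(base_hfov_deg)
    return math.degrees(2 * math.atan(math.tan(base / 2) * aspect / base_aspect))

print(horplus_hfov(90, 16 / 10))  # single 16:10 monitor  -> ~100.4 degrees
print(horplus_hfov(90, 48 / 10))  # 3x1 Eyefinity (48:10) -> ~149.0 degrees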
 
I am a pretty loyal NV fan, and by that I mean I haven't owned one since the X1200 days.

This is pretty badass.
 
Those are two different groups though, which I assume do not appear as one logical display to apps/games.

From what I have read, the game pulls the resolution from the available resolutions in Windows. That will be horizontal res x vertical res (as in 5760x1200 with 3x 1920x1200 displays).

In this video it looks like the display on the right is larger:
http://www.youtube.com/watch?v=04dMyeMKHp4

Yeah, just like using the native res on any single monitor is seamless. Which, of course, it's not. Take a look around the WSGF for an idea of how seamless an experience you can expect without developer support.

And considering how long widescreen support took to get to where it is today, despite the fact that you don't really have a choice when buying a monitor these days, it'll be a long while before you can expect universal support for a niche technology like this.

And see what you get when using our solutions where developers fail:
http://www.widescreengamingforum.com/screenshots/wow-th2go.php

Makes it all worth it!
 
Are you that biased towards nVidia that you can't just say "wow, nice one, AMD"? Christ. I was so red-biased years ago that I dedicated my name to their cards. Watch this: nVidia makes excellent products and I'm proud to own a GTX 280.

See? Fanboys CAN compliment their opposing brands.
Huh? Where am I biased? Sorry for having an opinion that multi-monitor gaming is a niche. I can totally see how that is so biased. Yeahhhhhh. Surrrrreeeee. Apparently you didn't catch that I said I'm waiting to see single-monitor numbers to see how strong this card is. Someone needs to learn how to read the "whole" post... :rolleyes:
 
The monitor borders would kill the positive experience for me too; only for multiple app windows or panels do I want things split across multiple displays. For that use, kudos to ATI for bringing a long-overdue feature. I just hope they do better-than-average driver development so there aren't bugs plaguing it; drivers have always been ATI's weakness. That, and Intel's IGP chipset dominance reducing their volume sales.
 
Have you guys seen this?

More info showing up by the minute <<< Source >>>

[image: 2s0xnc2.jpg]
Are those two 8-pin connectors I see? It's hard to tell...
 
With a 3x1, you have your main monitor in front of you with all of the primary action happening there where you are focused. The other two monitors flank that, giving your peripheral vision SOMETHING IT HAS NEVER HAD BEFORE during gaming - authentic stimulation, in motion and at high resolution.
Hate to burst your bubble, though, but multi-monitor gaming isn't something "new"... Some arcade manufacturers had been doing it for a long, long time, even before the Matrox Parhelia. For instance, Sega's racing game Ferrari F355 Challenge had 3 screens. It's cool, don't get me wrong. You either love it or hate it. Pretty cool to finally see it showing up as a built-in feature.

Ferrari F355 in motion:
http://www.youtube.com/watch?v=WgSB8WZHbL0&feature=related
 
It's the ability to do higher resolutions so you can actually use those three 30" monitors that's new... that, and the capability for six of them per card.

That said, what I'm really interested in now is the fact that they could run some of these games at ridiculous resolutions, so the card must have some oomph in it.
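Some napkin math on why - pure arithmetic, using the resolutions that came up in this thread:

Code:
# A 3x1 span pushes three times the pixels of a single 1920x1200 monitor.
single = 1920 * 1200    # 2,304,000 pixels
span = 3 * 1920 * 1200  # 6,912,000 pixels (5760x1200)
print(span / single)    # -> 3.0

# Six 30" panels per card (3x2 of 2560x1600) would be even nastier:
print((3 * 2560) * (2 * 1600) / single)  # -> ~10.7x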
 
Kyle, thanks for going and checking this out for us. I've been hearing the rumors, and it's great to get some firsthand confirmation. This is the first time I'm thinking about grabbing a top-tier card on release day. :)

I started toying around with SoftTH the other day, which isn't bad for free, but I have a feeling the mainstream support from AMD on this will be awesome. I've been playing L4D like this every night for the last week... and it rules. :)

[image: l4d3mon1.jpg]
 