Author Topic: Video Cards (nVidia vs. ATI, again)  (Read 9480 times)
Phoenix
Bird of Fire
 

Team Member
Elite (7.5k+)
*********
Posts: 8805

WWW
« on: 2004-05-14, 02:24 »

I'm sure everyone's been ogling the benchmarks and feeling the ATI X800 is the "king of cards" by now.  Well, not so fast.  You might want to look into this article:

http://www.pcper.com/article.php?aid=40&type=expert

I found this part intriguing myself:


Quote
Looking at the $399 price-point, we see that there is considerable potential for overclocking the GeForce 6800 GT. This board offered faster performance and equivalent image quality to the Radeon X800 pro. Equipped with the same 16 pipelines seen on the flagship GeForce 6800 Ultra, there will be many who try and coax the additional 50MHz from the core and save themselves $100. Further considering how many vendors will likely spec faster memory, the odds of getting a healthy boost over stock speeds seems in your favor. When you take these facts and realize that it is a single-slot solution with one Molex connector, it seems convincing to pick the GeForce 6800 GT as the best bang for the buck for this price-point.

In other words, you can OC this card and get the same performance as from the 6800 Ultra, and the 6800 Ultra is just about on par with the X800XT for most purposes.  $100 less, and near-equivalent performance?  That's another word for "market share".  I'd say nVidia's not "down for the count" yet, especially if you read on to the last page of the article and look at the "Concerns" sections between the cards, specifically dealing with an interesting anomaly that's present with the current ATI reference cards.
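
Rough math on that, for the curious (stock clocks and the Ultra's price quoted from memory, so treat them as approximate): the 6800 GT is specced at roughly 350MHz core for $399, while the 6800 Ultra runs 400MHz for about $499.  Coax that extra 50MHz out of the GT and you're at Ultra clock speeds with the same 16 pipelines, for around $100 less.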
Logged


I fly into the night, on wings of fire burning bright...
games keeper
 

Elite
*
Posts: 1375

« Reply #1 on: 2004-05-14, 16:12 »

1) ATI's image quality has been higher than any nVidia card's up until now, maybe.

2) Have you seen the cooling block on one of those things? It takes up another PCI slot, meaning 2 to 3 PCI slots in total.

3) I thought an article said the GeForce 6 needs a 460W power supply. (Although I finally bought a 550W PSU, I don't think I'm gonna buy a second one.)
Logged
OmEgA-X
 

Beta Tester
Hans Grosse
********
Posts: 281

« Reply #2 on: 2004-05-15, 00:13 »

Go nVidia!  Thumbs up! I don't care about the PCI slots.. I've got like 5.. I just care about performance.. and money, of course  <3
Logged
Phoenix
Bird of Fire
 

Team Member
Elite (7.5k+)
*********
Posts: 8805

WWW
« Reply #3 on: 2004-05-15, 04:56 »

To answer those concerns, Games Keeper...

1)  The quality between the two cards is about equal at the moment.

2)  The 6800GT does NOT take up two slots.  If you had read my quote you'd know this.  Here's the part you missed:

Quote
...When you take these facts and realize that it is a single-slot solution with one Molex connector...

This is in regard to the 6800 GT model, not the 6800U reference sample that's been previewed by just about every review site so far.

3)  nVidia has announced a lower power spec than previously stated.  For details I recommend checking out THIS LINK.  It also addresses the functional requirements of a single Molex connector vs. two Molex connectors on the 6800U model.
Logged


I fly into the night, on wings of fire burning bright...
games keeper
 

Elite
*
Posts: 1375

« Reply #4 on: 2004-05-15, 10:32 »

In that case, nice.

But still: keep the price down, keep the wattage down, keep the heat down, keep the size down,

and all is fine by me.
Logged
OmEgA-X
 

Beta Tester
Hans Grosse
********
Posts: 281

« Reply #5 on: 2004-05-15, 21:52 »

That looks like what nVidia is trying to do.. they've always had their cards much cheaper than ATI's, and IMO nVidia's drivers are more stable than ATI's Catalyst drivers. I've also never had a GeForce card overheat.. so I'm with nVidia all the way  Thumbs up!
Logged
ConfusedUs
 

Elite (2k+)
**
Posts: 2358

WWW
« Reply #6 on: 2004-05-15, 21:59 »

I'll go with the solution that offers the best bang for the buck.
Logged
Tekhead
 
Elite
*
Posts: 1110

« Reply #7 on: 2004-05-15, 22:47 »

I'll go with ATI, simply because right after nVidia raised the ante with a new video card release, ATI almost immediately responded with a card that's on par with it. This tells me that ATI's developing something schweet but incomplete, yet they've got the power to release a prototype to keep their competitors in their place =]
Logged
ConfusedUs
 

Elite (2k+)
**
Posts: 2358

WWW
« Reply #8 on: 2004-05-15, 23:06 »

ATI's been working on their next-gen cards just as long as nVidia has. They're due out soon, if I recall correctly.

Just like in the past, I'd expect them to release something at least on par with nVidia's stuff. If they put something out that was vastly inferior, it'd be the first nail in the coffin.

Remember 3DFX with the voodoo 4/5? They were nothing compared to nVidia. Shortly thereafter, 3DFX was no more, and nVidia bought them out.
Logged
Genialus
 

Shambler
*****
Posts: 119

« Reply #9 on: 2004-05-16, 13:38 »

Quote from: ConfusedUs
I'll go with the solution that offers the best bang for the buck.
I'm with you on that one. It won't be until we see the finished cards that we'll know who's really wearing the crown...
Logged
Phoenix
Bird of Fire
 

Team Member
Elite (7.5k+)
*********
Posts: 8805

WWW
« Reply #10 on: 2004-05-17, 04:57 »

"Wearing the crown" doesn't a king make.  Remember, revenues are where it's at, and the bulk of revenues rest not upon the "fastest" card, but in the mid-level range where most people can actually afford to buy something.  nVidia has always had a one-up on ATI there, and I think that trend will continue.  Sure, it's best to have the "fastest around", but what good is it if you can't make any money to continue to develop faster cards?  I hope both companies last a nice long time so this technology continues to advance.
Logged


I fly into the night, on wings of fire burning bright...
Angst
Rabid Doomer
 

Team Member
Elite
***
Posts: 1011

WWW
« Reply #11 on: 2004-05-17, 06:15 »

I try to keep it obvious where I stand here. ATI has been good to me, but ATI wouldn't be where they are if nVidia weren't there to kick them in the ass.

MY concern is some of the marketing material I've seen from nVidia recently. I haven't seen propaganda piled on that thick since WW2 posters.

http://angs7.linesofsite.net/forum/propaganda.pdf

The above PDF was emailed to a friend of mine who works as director of marketing for a chain of computer stores. Granted, nVidia's new card kicks ass, but this was uncalled for.

edit: I recommend right-click saving the link as opposed to directly opening it.
« Last Edit: 2004-05-17, 06:22 by Angst » Logged

"Who says a chainsaw isn't a ranged weapon?"
Woodsman
Icon of Booze
 

Beta Tester
Icon of Sin
***********
Posts: 823

« Reply #12 on: 2004-05-17, 06:25 »

I don't think I'm the only one who's tired of seeing "The way it's meant to be played!" on half the games I buy.  Frankly, I'm a little tired of this whole video card cold war myself. It's gotten to the point where I can't see it as anything more than a dick-measuring contest between rabid fanboys.

And really, the whole drivers argument is years out of date. I haven't had a single driver problem on my Radeon 9700 Pro since I got it, and that was nearly 2 years ago. I purchased an ATI card because the performance of competing cards at the time didn't come within leagues of it. I'll be purchasing a new card around fall, and should the nVidia offering at the time offer more performance than its ATI counterpart, I will purchase it. I don't buy new hardware very often, so when I do I try to get the nice stuff, and as such the 100 or 200 dollar cards are not a big consideration for me. But I have read reviews, and I have to say the difference between the midrange ATI and nVidia cards is so paper-thin it's almost not worth considering (Radeon 9600 and the GeForce FX 5600).

What I'm tired of seeing are these stupid "updates" to existing chipsets to gain back that 10% performance boost that the other card developer stole from you the last time round. Or, as I call them, face-saving cards. Really, we don't need to see any more of this "GeforceFX5950superflyultra" or "radeon9800XTsexyedition" crap.
« Last Edit: 2004-05-17, 06:40 by Woodsman » Logged
Phoenix
Bird of Fire
 

Team Member
Elite (7.5k+)
*********
Posts: 8805

WWW
« Reply #13 on: 2004-05-17, 06:44 »

What's the problem with running an aggressive marketing scheme?  Don't all companies do this?  Is this not expected in this day and age?  That advertisement sheet is not necessarily propaganda, although any advertising is propaganda when you think about it.  What's wrong with printing the good things that have been said about your product, as long as what's printed is accurate to what was actually said?

I've read a lot of the reviews this information was drawn from.  I've seen the clock frequency variances in ATI's hardware.  Sure, ATI's stuff has run faster on certain settings in some games, but like they stated, the card is ALREADY maxed out and can't be overclocked, and every OpenGL vs. D3D test I've seen DOES put nVidia's card ahead.  I've seen the X800 get its ass handed to it on a platter in OpenGL games by the 6800, the same as I've seen ATI's card whip the 6800 around at higher FSAA and anisotropic settings in D3D games.  Like Con, I'll buy whatever is best for the money, which I don't have at the moment, so that means neither.  There are strong sides and drawbacks to both cards.  It really depends on what you intend to do with the thing and how much dough you're willing to shell out.

I do find the technology advancement fascinating, and for those whining "I'm sick of all this, waa!" I say count your blessings.  You could have been born 500 years ago, where your "fun" would consist of sloshing through cow dung, tilling fields, and slaughtering pigs from sunup to sundown, instead of sitting on your arses playing video games while someone in Taiwan slaves for $2.00 a week to make your fun little electronic toys.

I don't care about logos and fanboys myself.  They can slap whatever they want on a box or in a splash screen.  The box is just there until the game is opened, and the "Esc" key is there for a reason.  If you don't like it, don't buy the card, and don't buy the games.  To me it's that simple.
Logged


I fly into the night, on wings of fire burning bright...
Angst
Rabid Doomer
 

Team Member
Elite
***
Posts: 1011

WWW
« Reply #14 on: 2004-05-17, 07:00 »

Aggressive marketing is one thing, but the added statements, e.g.:
"If you can't figure out how to support SM 3.0, then pretend it's not important!" or
"[Also it's the main extra feature on NV40 vs R420 so let's discourage people from using it until R5xx shows up with decent performance...]" Note the square brackets indicating a 'clarification.' I don't see accuracy here; I see a sarcastically twisted 'quote,' and I use that last term loosely. These are uncalled for, IMHO. 75% of the material is simply slander. This is the kind of mudslinging I'd expect to see in politics. Unfortunately, it's becoming all too common in all walks of life: academia, technology, etc. As a salesman, I've often prided myself on selling a product based on its QUALITY, not on ripping into my competition in order to gloss over any f-ckups of my own.
« Last Edit: 2004-05-17, 07:06 by Angst » Logged

"Who says a chainsaw isn't a ranged weapon?"
Phoenix
Bird of Fire
 

Team Member
Elite (7.5k+)
*********
Posts: 8805

WWW
« Reply #15 on: 2004-05-17, 07:35 »

And ATI is not covering over its own?  Slipgate - Confused
Logged


I fly into the night, on wings of fire burning bright...
Tabun
Pixel Procrastinator
 

Team Member
Elite (3k+)
******
Posts: 3330

WWW
« Reply #16 on: 2004-05-17, 13:44 »

Semi-:offtopic:
Personally, I've never understood the fanboyism. Why be biased toward one brand or another? They all make mistakes, they're all full of shit to some extent, and they sure as hell all apply near-satanic marketing schemes. Whatever's best when I feel it's time to buy a card is best. Whether that be ATI or nVidia, I couldn't give a crap. That said, I'm not an overclocker, and I'm not prepared to spend hundreds of bucks for a 5 FPS increase.
Hopefully, we won't ever get to the point where a monopoly on GPU development goes and spoils our fun.
« Last Edit: 2004-05-17, 13:45 by Tabun » Logged

Tabun 'Morituri Nolumus Mori'
games keeper
 

Elite
*
Posts: 1375

« Reply #17 on: 2004-05-17, 20:31 »

Isn't that the case already?
(points to Half-Life 2 == ATI sponsored,
points to Unreal Tournament 2004 == nVidia sponsored)
Logged
Phoenix
Bird of Fire
 

Team Member
Elite (7.5k+)
*********
Posts: 8805

WWW
« Reply #18 on: 2004-05-19, 16:37 »

For the technically curious:

http://www.neoseeker.com/Articles/Hardware...rce6800/10.html

And a rather interesting quote from it:


Quote
Closing Thoughts

Neoseeker: How is NVIDIA dealing with the widening gap between memory technology and GPU technology? It seems like modern GPUs have been more memory bandwidth starved in a lot of cases rather than GPU instruction bound.

David Kirk: Actually, I believe that the trend is moving in the other direction. With the 6800, we are able to draw fullscreen 1600x1200 @ 60Hz and write pixels onto the screen about 60 times per frame. That seems like enough. I would rather use that extra bandwidth and the shader computation to do more interesting shading and processing. One of the great advantages of GPU architecture is that GPUs are highly parallel stream processors, capable of processing vast amounts of data, doing lots of floating point number crunching. Fortunately, graphics is an embarrassingly parallel problem, with lots of data to be processed. Actually writing out the results is becoming a smaller and smaller part of that task, and that's where the bandwidth mainly comes in.

60 raw passes in 1/60th of a second at 1280x1024 = 4,718,592,000 per-pixel calculations per second.  Four point seven BILLION.  If you ask me, that's nothing to sneeze at right there, and that's just raw processing power.  Read the article; it's quite fascinating if you're interested in the direction computer graphics are heading technologically, and what future games will be able to do.
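
For anyone checking my math, here's the back-of-the-envelope breakdown (my own numbers, not from the article):

  1280 x 1024              =     1,310,720 pixels per frame
  x 60 overdraw passes     =    78,643,200 pixel writes per frame
  x 60 frames per second   = 4,718,592,000 pixel writes per second

At the quote's full 1600x1200, the same math gives 1,920,000 x 60 x 60 = 6,912,000,000, call it 6.9 billion.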
Logged


I fly into the night, on wings of fire burning bright...
shambler
 
Icon of Sin
**********
Posts: 999

« Reply #19 on: 2004-05-19, 17:24 »

Reliability first

Performance second

Price third

Nice box fourth Slipgate - Laugh
Logged