
Review: Nvidia's GeForce GTX 660


Okay, wow, this is awkward. You know, we really can't keep meeting like this. Every few weeks, it seems, we're back in this same place, and I'm telling the same story again. You know the one, where Nvidia has taken its potent Kepler GPU architecture, shaved off a few bits, and raised the performance ceiling available on a lower budget. By now, you know how it ends: with me explaining that this new graphics card delivers enough performance for most people and wondering why anyone would spend more. You're probably expecting me to say something about how the competition from AMD is pretty decent, too, even though the Radeon's power draw is higher. By now, the script is getting to be pretty familiar. Heck, I can see your lips moving while I talk.

Well, listen up, chum. I'm nobody's fool, and I'm not going to keep playing this same record over and over again, like Joe Biden at the DNC. I can do things, you know. I should be, I don't know, explaining write coalescing in the Xeon Phi or editing a multicast IP routing table somewhere, not helping you lot decide between a video card with 10 Xbox 360s worth of rendering power and another with 14. This second-rate site can get itself a new spokesmonkey.

[Photo: the GeForce GTX 660]


I am totally not going to tell you about the video card shown above, the GeForce GTX 660. You can see from the picture that it's based on the same reference design as the GeForce GTX 660 Ti and GTX 670. And if you have half a working lobe in your cranium, you know what's coming next: the price is lower, along with the performance. Look, it's as simple as a few key variables.

            Base clock  Boost clock  Peak ROP rate  Filtering int8/fp16  Peak shader  Rasterization  Memory transfer  Memory bandwidth  Price
            (MHz)       (MHz)        (Gpix/s)       (Gtex/s)             (tflops)     (Gtris/s)      rate             (GB/s)
GTX 660     980         1033         25             83/83                2.0          3.1            6.0 GT/s         144               $229.99
GTX 660 Ti  915         980          24             110/110              2.6          3.9            6.0 GT/s         144               $299.99
GTX 670     915         980          31             110/110              2.6          3.9            6.0 GT/s         192               $399.99
GTX 680     1006        1058         34             135/135              3.3          4.2            6.0 GT/s         192               $499.99

You really don't need me for this. Versus the GTX 660 Ti, this ever-so-"new" product is a tad slower in texture filtering, rasterization, and shader flops. And yes, that really is a drop from 14 Xboxes worth of filtering power to 10. The ROP rate and memory bandwidth haven't even changed, and yet the price is down 70 bucks. This value proposition doesn't involve tricky math.
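
If you insist on seeing the math anyway, the peak figures in that table fall straight out of unit counts times the boost clock. Here's a quick back-of-the-envelope sketch in Python (my own arithmetic, not Nvidia's spec sheet), assuming two flops per ALU per clock for a fused multiply-add:

    # Back-of-the-envelope check: peak rates as unit counts x boost clock.
    # Unit counts come from the GK106 table further down.

    boost_ghz   = 1.033   # GTX 660 boost clock, GHz
    rops        = 24      # pixels written per clock
    tex_units   = 80      # int8 texels filtered per clock
    shader_alus = 960
    tris_clock  = 3       # rasterized triangles per clock
    mem_rate    = 6.0     # memory transfer rate, GT/s
    bus_width   = 192     # memory interface width, bits

    print(f"ROP rate:       {rops * boost_ghz:5.1f} Gpix/s")                    # ~25
    print(f"Texture filter: {tex_units * boost_ghz:5.1f} Gtex/s")               # ~83
    print(f"Shader flops:   {shader_alus * 2 * boost_ghz / 1000:5.2f} Tflops")  # ~2.0
    print(f"Rasterization:  {tris_clock * boost_ghz:5.1f} Gtris/s")             # ~3.1
    print(f"Mem bandwidth:  {mem_rate * bus_width / 8:5.0f} GB/s")              # 144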

Heck, you probably don't even care that the card has a mixed-density memory configuration, with three 64-bit interfaces driving 2GB of GDDR5 memory. Who needs to know about that when you're playing Call of Duty or prancing around in your fancy hats in TF2? All you're likely to worry about are pedestrian concerns, like the fact that this card needs only 140W of power, so it requires just one six-pin power input. I could tell you about its high-end features, such as support for up to four displays across three different output types, PCI Express 3.0 transfer rates, or two-way SLI multi-GPU teaming, but you'll probably forget about them two paragraphs from now. Why even bother?
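
For the handful of you who do care, here's the napkin math, sketched in Python. The 512MB/512MB/1GB split across the three 64-bit interfaces is my assumption about how the mixed-density arrangement shakes out (the text only says three interfaces and 2GB total), and the connector limits are the standard PCIe figures:

    # Mixed-density memory and power-budget arithmetic for the GTX 660.
    # The per-channel split below is an assumption; only the 2GB total and
    # the three 64-bit interfaces come from the card's spec.

    channels_mb = [512, 512, 1024]     # MB hung off each 64-bit interface (assumed split)
    assert sum(channels_mb) == 2048    # 2GB total, as stated

    slot_watts    = 75                 # PCIe x16 slot can supply up to 75W
    six_pin_watts = 75                 # one 6-pin PCIe connector adds another 75W
    board_watts   = 140                # GTX 660 rated board power

    headroom = slot_watts + six_pin_watts - board_watts
    print(f"Power headroom: {headroom} W")   # 10W to spare on a single 6-pin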

A special chip

You know what's rich? This seemingly pedestrian branding exercise actually involves new GPU silicon. They're calling this thing "GeForce GTX 660," but it's not based on the same chip as its purported sibling, the GeForce GTX 660 Ti. That's right: the GTX 660 is based on the GK106 chip, not the GK104 part that we've been talking about for months.

[Block diagram: the GK106 GPU]


This is a smaller, cut-down chip with fewer units throughout, as depicted in the block diagram above. The unit counts in that diagram are correct for the GTX 660, right down to that third GPC, or graphics processing cluster, with only a single SMX engine inside of it. Is that really the GK106's full complement of units? Nvidia claims, and I quote, that the GTX 660 "uses the full chip implementation of GK106 silicon," but I remain sceptical. I mean, look at it. Really, a missing SMX? I know better than to trust Nvidia. I've talked to Charlie Demerjian, people.
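
If it helps, here's a tiny Python sketch of the layout as I read that block diagram: three GPCs, the third holding a single SMX, and the usual 192 ALUs per Kepler SMX. The 2/2/1 grouping is just my reading of the diagram, so treat it as a sketch rather than gospel:

    # Rough model of the GK106 layout as the block diagram describes it.
    # 2/2/1 SMX-per-GPC grouping is read off the diagram; 192 ALUs per SMX
    # is the standard Kepler arrangement.

    ALUS_PER_SMX = 192

    gpcs = {"GPC0": 2, "GPC1": 2, "GPC2": 1}   # SMX engines per cluster

    smx_total = sum(gpcs.values())
    print(f"SMX engines: {smx_total}")                 # 5
    print(f"Shader ALUs: {smx_total * ALUS_PER_SMX}")  # 960, matching the table below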

          ROP pixels  Texels filtered    Shader  Triangles  Bus width  Transistors  Die size  Process
          /clock      /clock (int/fp16)  ALUs    /clock     (bits)     (millions)   (mm²)     node
GF114     32          64/64              384     2          256        1950         360       40 nm
GF110     48          64/64              512     4          384        3000         520       40 nm
GK104     32          128/128            1536    4          256        3500         294       28 nm
GK106     24          80/80              960     3          192        2540         214       28 nm
Cypress   32          80/40              1600    1          256        2150         334       40 nm
Cayman    32          96/48              1536    2          256        2640         389       40 nm
Pitcairn  32          80/40              1280    2          256        2800         212       28 nm
Tahiti    32          128/64             2048    2          384        4310         365       28 nm

With its five SMX cores, the GK106 has a total of 960 shader ALUs (calling those ALUs "CUDA cores" is loopy marketing speak, like saying a V8 engine has "eight motors"). Beyond that, look, the specs are in the table, people. The only thing missing is the L2 cache amount, which is 384KB. (Note to self: consider adding L2 cache sizes to the table in the future.) You've probably noticed that the GK106 is just two square millimetres bigger than the Pitcairn chip that powers the Radeon HD 7800 series. Seriously, with this kind of parity, how am I supposed to conjure up drama for these reviews?

[Photo: the GK106 chip alongside a quarter]


I probably shouldn't tell you this, but since I've decided not to do a proper write-up, I'll let you in on a little secret: that quarter is frickin' famous. Been using the same one for years, and it's all over the Internet, since our pictures are often, uh, "borrowed" by content farms and such.




