Comments
Greetings,
The real thing that'll bring down GPU mining is not newer GPUs. It's ASICs targeted specifically at the ASIC-resistant cryptocurrencies. Bitcoin has been beyond the realm of GPU mining for a while, but many of the newer coins are based on Ethereum, and that has a RAM requirement that is harder to manage in ASIC form.
As Ethereum becomes more mineable by application-specific processors, we might see a pullback in the number of GPU devices being sold to miners. Or folks getting out of GPU mining before the ASICs come in, if they're aware of what's happening.
Sidenote: It doesn't... You could run the Iray Server on the same box to get the effect you're looking for. It just needs an IP address, and you can put in your own (or 127.0.0.1, aka localhost, if the server is listening on it).
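If it helps, here's a minimal sketch (plain Python, nothing Iray-specific) for confirming that something is actually listening on 127.0.0.1 before you point DAZ Studio's Iray Server address at it. The port number is purely an assumption, so swap in whichever port your Iray Server install is configured to use.

```python
# Minimal sketch, assuming Iray Server is already running locally: check that
# something is accepting TCP connections on 127.0.0.1 before pointing DAZ
# Studio at it. The port (9090) is an assumption -- substitute whatever port
# your Iray Server instance is actually configured to listen on.
import socket

def is_listening(host: str = "127.0.0.1", port: int = 9090, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    print("Iray Server reachable on localhost:", is_listening())
```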
-- Morgan
And if the Vega Frontier was able to better compete with Nvidia then you would see Nvidia follow suit with a 16 GB card of their own closer to that price range. Instead, we got a Titan V that is freakin' $3000. So again, that is why competition is so vital. Just like how we can thank AMD for finally getting us past 4 cores in mainstream CPUs because of how well Ryzen competes against Intel. But AMD's Vega has struggled, in large part because AMD simply cannot manufacture these things. Plus the performance of Vega is nowhere near as groundbreaking as Ryzen.
So it bears repeating: how well AMD competes has a direct effect on what you can do with Daz Iray, even though Iray cannot use AMD cards. This is why you want AMD to do well, to push Nvidia to do better. One would also hope that other rendering software would serve as competition for Iray and push it to improve as well, but that is less likely (IMO); gamers and now AI are the things Nvidia focuses on first. That Pascal did not ship with a working Iray driver is all the proof one needs to understand Nvidia's priorities. You think Nvidia would ship GPUs with no game drivers and expect gamers to wait several months to get them? LOL, such a farce would put them out of business.
Iray is going to have some serious shortcomings if it doesn't keep up with other render engines this year. That's also something I have been warning about for a really long time, but that's another topic.
...the issue with Nvidia is they will not compete against their more expensive pro-grade Quadro line. This is part of why VRAM for the GTX and Titan lines is capped at 11 and 12 GB respectively. If they came out with a 16 GB GDDR6 GTX card, even at the MSRP of a Titan Xp, it would put a dent in sales of their $2,000 Quadro P5000 and make the P4000 moot (actually, both the standard 1080 and the 1080 Ti already do the latter). However, if funds were not a concern and I were given the choice between the P5000 and the Titan V, I'd actually take the Quadro for the higher VRAM, as again, for my purposes that is more important.
Part of the reason behind AMD's troubles with Vega Frontier is that HBM2 memory is in even shorter supply than standard VRAM (all Quadros save for the $7,500 GP100 still use GDDR5/5X).
Sadly, as I have mentioned, we are a small segment of the GPU-buying public, so our needs are not that high a priority. Indeed, if they did the same to gaming enthusiasts as they did to us with Pascal, it would have produced a major uproar and possibly driven some to the other side, as games are not dedicated to a specific "brand" of GPU like render engines are.
I already consider Iray to have major shortcomings compared to Octane and the forthcoming Octane 4. Either you need a big-VRAM GPU, which for many has become stupidly expensive these days, or you are stuck for the most part rendering slowly on the CPU. Again, with Octane's out-of-core mode you don't need an ultra-high-memory GPU or have to spend time optimising to get improved rendering performance. If Nvidia faces any challenge in the near future, it may not come so much from AMD as from developers like Otoy, since V4 will also support OpenCL as well as Intel integrated graphics, no longer making it "brand specific" like Iray.
...and at $20 a month for the subscription path, I see it as a better, more elegant solution to the situation.
Do you know how much VRAM your scenes actually require?
...more than the card I currently have as they more often than not dump to the CPU. I tend to prefer creating vast sweeping scenes with a high amount of detail and effects at large resolution sizes which comes from my painting days. Not into portraits or simple vignettes.
But you currently have 2GB? What makes you think your stuff wouldn't fit in 11GB? The log file should contain the actual amount that your scene required. Something like
IRAY rend stat : Texture memory consumption: 1.17012 GiB for 39 bitmaps (device 0)
...4 GB, acquired an old Gigabyte 750 Ti.
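If digging through the log by hand gets tedious, here's a minimal sketch that pulls those "Texture memory consumption" lines out of a DAZ Studio log. The log path is an assumption (if I recall correctly, Help > Troubleshooting > View Log File shows where yours actually lives), and the MiB case is included just in case smaller scenes report in that unit.

```python
# Minimal sketch for skimming a DAZ Studio log for the Iray texture-memory
# lines in the format quoted above. The log location is an assumption -- use
# whatever path your own installation reports.
import re
from pathlib import Path

LOG_PATH = Path("log.txt")  # assumed; replace with your actual DAZ Studio log file
PATTERN = re.compile(
    r"Texture memory consumption:\s*([\d.]+)\s*(GiB|MiB)\s*for\s*(\d+)\s*bitmaps"
)

def texture_memory(path: Path):
    """Yield (amount_gib, bitmap_count) for each texture-memory line found."""
    for line in path.read_text(errors="ignore").splitlines():
        match = PATTERN.search(line)
        if match:
            amount, unit, bitmaps = float(match[1]), match[2], int(match[3])
            yield (amount if unit == "GiB" else amount / 1024.0), bitmaps

if __name__ == "__main__":
    for gib, bitmaps in texture_memory(LOG_PATH):
        print(f"{gib:.2f} GiB of textures across {bitmaps} bitmaps")
```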
Also don't forget that the larger the render resolution, the more memory is required.
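To put rough numbers on that, here's a back-of-the-envelope sketch. The bytes-per-pixel and buffer-count figures are assumptions (the real cost depends on how many canvases and working buffers the renderer keeps per pixel); the point is only that the cost grows linearly with pixel count.

```python
# Back-of-the-envelope sketch of how memory scales with render resolution.
# Both constants below are assumptions, not Iray's actual internals.
BYTES_PER_PIXEL = 16   # assumed: one RGBA buffer at 32-bit float per channel
BUFFER_COUNT = 2       # assumed: e.g. beauty + one working/alpha buffer

def framebuffer_mib(width: int, height: int) -> float:
    """Rough framebuffer memory in MiB for a given output resolution."""
    return width * height * BYTES_PER_PIXEL * BUFFER_COUNT / (1024 ** 2)

if __name__ == "__main__":
    for w, h in [(1920, 1080), (3840, 2160), (7680, 4320)]:
        print(f"{w} x {h}: ~{framebuffer_mib(w, h):,.0f} MiB just for frame buffers")
```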
The Titan V wasn't developed for the amateur 3D market, so technically, "we" didn't really get anything. It's meant for the science and AI researchers, and they are getting a helluva bargain at $3000. "Our" card is due to hit this summer, and then we can judge whether or not we are getting fleeced due to lack of competition. We can decide for ourselves if we want to spend the quid for a Titan V, but to complain that Nvidia is marketing an overpriced card to 3D users is a misunderstanding of where (or to whom) the card is aimed.
I ended up going the APU build route. Not regretting my decision thus far.
Just added another update in this saga, with news that is looking really good for those of us shopping for video cards. :)
You are misunderstanding my point. Are you telling me that if AMD was competing with the Titan V that it would still be $3000? No, it would not.
I'm not sure where you get the idea that Titans are exclusively for pro industries. The original Titans had GTX in their name, Titans have been marketed as GTX members, and the card is sort of a hybrid between the GTX and pro-level series. And what is GTX? GTX is their gaming brand, is it not? Titans get gaming drivers. If Nvidia truly wanted to sell this as a pro card, it would have had different branding on it, and they have Quadro for that purpose. The Titan is their "pony car", but make no mistake, it is still a car. Not a truck, which is what a pro card would be by comparison.
And again, history is on that side. Titans have sold for $1000, $1200, and $1500 in past iterations, which is not so out of reach for "us". These were not decades ago, so inflation is not the cause of this price. The Titan Z was $3000, but it was not received very well because of its price.
If you watch Nvidia's conference, it is quite clear that what they are gearing towards science and research is Quadro, not Titan. There is no mention of Titan whatsoever at this conference.
For those folks interested in the article I had mentioned: I followed this guy's build pretty much, except that even used ECC RAM is up in price now.
http://www.beer30.org/building-a-32-thread-workstation-for-under-750/
If you look at the spec sheet for the Titan V, you will see one unusual line that stands out as different from any other generation: Tensor cores. Now, what use would a non-scientist / non-AI-researcher have for tensor cores? We don't use tensor cores in gaming or 3D rendering or anything outside a laboratory. So, should we choose to pay $3000 for the thing, please understand that we are paying extra for things that are useless in our 3D work, and that responsibility lies completely in our laps, not Nvidia's or AMD's. The Titan V is a break from the past. A budget scientific research card. It even offers buyers free access to scientific software! Nvidia has no obligation to keep its Titan line consistent with any of our expectations. And to expect AMD to automatically have a rival card for this very new niche is not very realistic.
Aren't tensor cores extremely likely to be used in AI denoising?
So, the 32GB HBM2 Quadro GV100 was officially launched today, with a whopping $8,999 MSRP.
Doing a quick comparison of the specs between it and the P6000 (24GB GDDR5X), which has a $4,999 MSRP, shows a small bump in 32 bit compute, from 12 Tflops (P6000) to 14.8 Tflops (GV100), despite having a significantly higher core count.
Almost twice the money for an extra 33% of VRAM seems steep to me... even if it is say 23% faster... I'm struggling hard enough already trying to swallow maybe buying a $5K card price wise, and $9K is just, ugh!
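For what it's worth, here is the arithmetic behind that reaction, using only the MSRPs and spec figures quoted above (so treat them as approximate):

```python
# Quick arithmetic on the figures quoted in this thread to see what the
# extra money actually buys.
cards = {
    "Quadro P6000": {"price_usd": 4999, "vram_gb": 24, "fp32_tflops": 12.0},
    "Quadro GV100": {"price_usd": 8999, "vram_gb": 32, "fp32_tflops": 14.8},
}

# Cost per GB of VRAM and per TFLOP of FP32 compute for each card.
for name, c in cards.items():
    print(f"{name}: ${c['price_usd'] / c['vram_gb']:.0f}/GB, "
          f"${c['price_usd'] / c['fp32_tflops']:.0f}/FP32 TFLOP")

# Relative increases going from the P6000 to the GV100.
p6000, gv100 = cards["Quadro P6000"], cards["Quadro GV100"]
print(f"Price: +{gv100['price_usd'] / p6000['price_usd'] - 1:.0%}, "
      f"VRAM: +{gv100['vram_gb'] / p6000['vram_gb'] - 1:.0%}, "
      f"FP32: +{gv100['fp32_tflops'] / p6000['fp32_tflops'] - 1:.0%}")
```

That works out to roughly 80% more money for 33% more VRAM and 23% more FP32 throughput, which is where the "steep" feeling comes from.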
----
Anyways, back on topic, I think AMD and NVidia will do just fine if/once the cryptominers move on to more specialized solutions. It'll take some time for those solutions to hit the market en masse, and there are a number of other segments that are chomping at the bit for GPU capacity (deep learning, etc.). Most of those segments are more NVidia oriented, of course. Plus, I think that both companies have been careful about ramping up their production, so while there may be a short-term glut of cards flooding the market, they've weathered that before.
It's not ugh for the people it's designed for... the cost/energy/space saving is worth it to enterprise-level people, whose pockets are deeper than ours. I know what's on my Christmas list this year :)
From AnandTech, pertinent to the discussion.
"Aren't tensor cores extremely likely to be used in AI denoising?"
I have been following this line of thinking with a view to purchasing Voltas for a rendering rig. Since Redshift, Iray, and Octane all have working builds with AI denoising, my primary conclusion is that tensors will have no significant implications for the end user. Tensor cores probably do quicken the training of the denoiser, but that is largely on the developer side, before the product ships. Further training is done as more renders are made, which doesn't involve tensor cores. The basis for my assumptions comes from the Redshift forums, where users have already started using the Titan V and V100 in production work. Of course, if the final release of the AI denoiser allows tensor cores to speed up the render, I would be tickled pink. In the meantime, I have gone ahead with my purchase of several 1080 Tis until everything shakes out.
...actually, the GV100 compares more with its predecessor, the 16 GB GP100, in performance; the latter retails for around $7,000.
Save for single-precision FP and boost clock performance, the P6000 falls rather short (see the rough ratio check in the sketch after this list):
Memory bus width (10.6x that of the P6000)
Error correction (full vs. half)
FP64 performance (19.5x that of the P6000)
Tensor performance of 118.5 TFLOPS (0 for the P6000)
More than a third more CUDA cores.
...and NVLink support between cards at 200 GB/s (vs. 2 GB/s bandwidth for high-bandwidth SLI), as well as allowing memory pooling for pure compute tasks (it still uses PCIe 3.0 x16 as the system connection, though).
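If anyone wants to see where those multipliers come from, here's a rough sanity check. The spec figures in it are my own assumptions pulled from publicly listed numbers, so treat them as approximate rather than authoritative.

```python
# Rough sanity check on the multipliers in the list above. The spec figures
# are assumptions (4096-bit vs 384-bit bus, ~7.4 vs ~0.38 FP64 TFLOPS,
# 5120 vs 3840 CUDA cores), only meant to show where the ratios come from.
gv100 = {"bus_bits": 4096, "fp64_tflops": 7.4, "cuda_cores": 5120}
p6000 = {"bus_bits": 384, "fp64_tflops": 0.38, "cuda_cores": 3840}

print(f"Memory bus width: {gv100['bus_bits'] / p6000['bus_bits']:.1f}x")          # ~10.7x
print(f"FP64 throughput:  {gv100['fp64_tflops'] / p6000['fp64_tflops']:.1f}x")    # ~19.5x
print(f"CUDA cores:       {gv100['cuda_cores'] / p6000['cuda_cores'] - 1:.0%} more")  # ~33% more
```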
...meanwhile, back to the topic.
Interesting news. Time will only tell if Ethereum and Monero miners see the new ASICs as a more solid hardware investment and if the hit will be big enough to get GPU prices back in line.
At best I see this happening in Q1, maybe Q2, of next year, as miners will still look to squeeze as much return out of their investment in GPU cards before making the switch and disposing of them on the resale market. By then we will be looking at the successor to Pascal (whatever Nvidia plans to call it), most likely with a price bump due to the new technology along with the continued shortage of memory chips and wafers.
I wouldn't expect to see a new 8 GB GTX card priced under $400 like the 1070 was, with 11, maybe 12, GB remaining the VRAM cap for the consumer line even given the boost to Nvidia's top-end Quadro card (most likely the Volta replacement for the P5000 will continue to have 16 GB but be upgraded to HBM2 memory and gain Tensor cores as well as NVLink compatibility, along with a price increase).
Well, Monero already made a change to its algorithm to avoid ASIC use, so there's no saying they can't do it again. And the Ethereum creators specifically said they want their currency mined on GPUs, which everyone can afford, as opposed to only people rich enough to buy ASICs cornering the market. So it looks like they're prepared to do whatever it takes to prevent ASIC mining. But at least the ETH price is still dropping. It's at its lowest since the surge started in December, and profitability is at an all-time low. Prices should start to fall soon, hopefully.
I can't decide between getting a 1080ti now or waiting to see what the new cards look like. I don't know if they will be that big of a leap over the current cards. Plus how long after the xx70/xx80 release are we going to have to wait for Ti versions? Plus availability. Lots of frustrated gamers have been waiting to buy for about 7 months now. This could be a long summer.
All I know is that I'm so glad I broke down and bought a 1070 last year before the prices spiked. I just did a lookup on Amazon and the exact same model I bought for $339.99 last May now sells for $699.99, and even the used ones are selling for over $500. :o Instead of investing in stocks and real estate, I should have invested in NVIDIA cards...
...what I mentioned was a "best case scenario"; however, I am still more inclined to agree with your outlook that ETH and Monero miners (as well as whoever mines whatever new coins become "hot") will continue using GPUs. This indeed poses a bleak future for us, as card prices will continue to remain artificially high due to demand outpacing supply. This is why I feel the affordability window has closed and will stay closed for a while to come.
For the moment just biding my time working with 3DL until Octane 4 is released. Interestingly, in stepping back, I find I am enjoying this again, not having to wait countless hours to see the results.
You can???
I can't predict what the crypto guys are going to do. I know it's pretty much unprofitable to mine with just about any card, as you can see here (you can plug in different cards and see the expected profit). So unless someone has free electricity, it's unprofitable to mine Ethereum for now. The question is how many miners will stubbornly continue buying cards, hoping the ETH price goes back up. I'm not trying to be a Debbie Downer; I guess I'm just trying not to get my hopes up too high.
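For anyone who wants to run the numbers themselves, here's a minimal sketch of the same revenue-minus-electricity calculation those profitability calculators do. Every figure in the example call is an assumption, so substitute your own card's hash rate and power draw, your electricity rate, and the current revenue per MH/s from whichever calculator you trust.

```python
# Minimal profitability sketch for a single GPU: daily revenue minus daily
# electricity cost. All inputs in the example below are placeholder assumptions.
def daily_profit_usd(hashrate_mhs: float, usd_per_mhs_per_day: float,
                     power_watts: float, usd_per_kwh: float) -> float:
    """Mining revenue minus electricity cost for one day, in USD."""
    revenue = hashrate_mhs * usd_per_mhs_per_day
    electricity = (power_watts / 1000.0) * 24 * usd_per_kwh
    return revenue - electricity

if __name__ == "__main__":
    # Assumed example: ~30 MH/s card drawing ~150 W at $0.12/kWh,
    # with revenue around $0.02 per MH/s per day.
    print(f"Estimated daily profit: ${daily_profit_usd(30, 0.02, 150, 0.12):.2f}")
```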
I just realized something new. I used to buy a $150 card every three or four years. Now I'm considering a $900 card. Is this going to be the new normal? I guess the technology is advancing. I can look at pretty pictures while I eat my peanut butter and jelly sandwiches and huddle around my pc for warmth. During the housing bubble, Home Depot was rumored to be introducing a home equity line of credit so you could shop in their stores and charge it directly to your HELOC. Maybe Nvidia will have to try that so we can afford their cards. Either that or we all work for them and get paid in cards.
I see a (tongue in cheek) option here: Nvidia introduces its own cryptocurrency, and if you opt in to mine for them you get a card at a reduced price (or a mining-based subscription, money back after every milestone achieved or something). What could possibly go wrong?
Of course, although it takes some work: export to Blender and use Cycles now; as soon as the AMD renderer works in Blender, use that instead.
... I also saw a thread somewhere about someone making a plugin for AMD's renderer for Studio.
...not being a "Debbie Downer" at all. People who believe they can get what they think is "free money" will pursue it even if it ends up costing them more in the long run (look at chronic gamblers). To them, the possibility of a big payout blinds them to the investment cost. I see this whole situation digressing into little more than a new form of gambling, hoping that, like what happened in late December, another spike in value will occur (ETH also rose considerably, just not to the ridiculous high that Bitcoin did). Maybe the novelty will eventually wear off (of course, I had hoped the same would have occurred in the case of SUVs, so much for that) and people will go back to their normal lives again.
We can only hope.
...yeah, but in the proliferation we have today, which began in the mid-1990s when every car company from BMW to VW to even "down to earth" Saturn began making them. Crikey, while waiting for a bus in the city centre, I often see more SUVs (not counting pickups) than sedans these days.
Those old ones were at least practical, rugged, and designed to be work vehicles with the only "luxury" appointments being an AM radio and heat. Oh, and back then they were simply called "four wheel drives" or "utility vehicles". The term "Sport Utility" almost borders on being an oxymoron.
Most of today's are built for status and looks, not practicality. Take an Escalade where an old Land Rover would easily go and it would look like it went through a demo derby, if it even survived the trip.