Comments
If there's no supply, there's not much profit. Trickle-up economics works just fine. Upgrade paths are a viable alternative for more people than the direct-upgrade path. Nvidia targets two specific markets: gamers with the GTX and brainiacs with the Quadro. Us wee folk on the rendered fringe using low-cost solutions are not "big players" in their ledgers. With the mining boom, they're exploring that market. Daz Studio users are not on their radar because there are a thousand hobbyists for every "getting paid to do it" artist. As well, a lot of us are also gamers, so we're in that market. Those few who are looking at building rendering rigs slowly and with as little lump-sum expense as possible are not paying their monthly bills by making huge purchases at once, but many streams combine to form a mighty river.
I still say their best move, and one that's easy for them to do, is build a GTX-based VCA box. It doesn't need 256GB of ECC RAM. Leave it user-configurable for CPU and standard RAM (single CPU or double, Xeon or i7, or whatever), and it doesn't have to run from a server. They can even cut it down so it only takes 4 GPUs, so long as you can add several to a single system. It'd be a lot cleaner than the Amfeltec GPU cluster, which is an electrical, fire, and trip hazard all at once.
Not to mention it'd be in a snazzy rackmount box.
If it requires heavy read/write operations to display while rendering, that can be solved by making Tesla-style render-only cards that dump out large buckets to the display card, rather than trying to do both at once. If the entire scene is dumped into the card's proposed SSDVRAM, then it should be a write-all-to-buffer-read-all-from-buffer process. Realtime rendering would not be an option, but I'll take limitless user-configurable headroom with timed updates over RT all day long.
Not really, presuming you're talking about IRAY.
Both have equal importance; why? 'Cause masses of CUDA cores with scenes that drop to CPU amount to a very expensive paperweight. Of course, eventually scenes can get too large for the RAM, but the more RAM, the less hassle getting the scene to fit.
First, determine what the majority of your scenes need RAM-wise; once this is known, look for cards that have at least this amount. Pick the one with the most CUDA cores, provided you can afford it or are prepared to pay the cost; otherwise, look at cheaper cards that fulfill your needs.
Alternatively, you could look for the most CUDA cores first, then decide if you can afford it; if you can, does it have the RAM required? If both answers are yes, then buy.
Either method works.
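For what it's worth, here's a minimal Python sketch of that first decision process; the card list, names, prices, and specs below are made-up placeholders for illustration, not real quotes:

```python
# Rough sketch of the selection logic described above.
# The card list is illustrative only; real prices and specs vary.

def pick_card(cards, scene_ram_gb, budget):
    """Keep cards with enough VRAM for the scene, then take the most
    CUDA cores we can afford; fall back to the cheapest card that fits."""
    fits = [c for c in cards if c["vram_gb"] >= scene_ram_gb]
    affordable = [c for c in fits if c["price"] <= budget]
    if affordable:
        return max(affordable, key=lambda c: c["cuda_cores"])
    return min(fits, key=lambda c: c["price"]) if fits else None

# Hypothetical inventory -- not real cards or prices.
cards = [
    {"name": "mid-card",  "vram_gb": 8,  "cuda_cores": 2560, "price": 450},
    {"name": "high-card", "vram_gb": 11, "cuda_cores": 3584, "price": 700},
]
print(pick_card(cards, scene_ram_gb=8, budget=500))  # picks "mid-card"
```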
Back in the older days of video cards, they DID have boards with sockets on them to upgrade the memory. The problem has to do with modern cards. First, the distance involved in mechanical sockets introduces latency that memory sitting on traces at most 1 cm from the GPU itself doesn't have. Second, at current speeds, memory on a video card has to be actively cooled as well. This means even more space between, and to, socketed memory sticks on a video card, meaning even more latency. That means a slower card.
nVidia and ATI/AMD were both forced to go to a strict on-board memory model in order to keep speeds and cooling competitive. When a gamer wants to wring the highest fps for his money out of a card, he'll pay for a non-upgradable card with faster performance.
The old ATI Rage16 (IIRC) and other contemporary cards often had memory sockets for upgrades. Back then, video cards didn't HAVE fans, and only the GPU on the higher-speed models needed a passive heat sink.
~
You said a mouthful, didn't you.
Agreed, amateur animators are not much of a target market, and yes, there is an artist on every street corner, as has always been the case. I do acknowledge that "gaming" (I don't think the name tells the whole story) is the primary thing keeping the desktop PC alive, and I am glad something is. It's worth keeping in mind, though, that an awful lot of product decisions today are made not by futurists or technologists or even by insightful users, but by myopic individuals with little or no technical knowledge or understanding.
I've never understood all those people who predicted the 'death' of the PC. I guess they're just trying to sell newspapers or web page clicks or whatever. In most cases, anyone who does any real work is using a PC. I mean, you probably could code a web page or prepare someone's taxes using an iPad, but I don't think it's optimal. Between office workers, scientists, the tech industry, and hobbyists (including gamers), the PC is safe from extinction. Sure, the average person may not own a computer anymore, but for me that just means I don't get so many calls from family and friends asking me to fix their computers.
~
Agreed that the so-called pundits often stand too close to the spreadsheet to see reality. However, the growth of remote servers continues, and a desktop PC doesn't fit in your pocket. The PC as we know it could become a niche market. If it isn't already.
I was in a computer store the other day buying an SD card for my camcorder, and there was a guy in front of me with a dozen Nvidia graphics cards. When he left I asked the cashier, "Bitcoin mining?" and he said, "Yeah, he comes in every week and buys every video card we've got, and then bitches at us because we don't have more."
I think as long as that's happening you're not going to see much relief in the price of graphics cards.
OK, so GPUs are getting snatched up almost immediately, reportedly there's a memory supply shortage too, and now there's also a wafer shortage?!?
http://www.guru3d.com/news-story/wafer-shortage-now-also-pushes-prices-upwards.html
We can't seem to catch a break guys!
On a side note, I just saw on a Dutch news site that the police arrested someone for money laundering and illegally tapping electricity. Usually in these cases they find a large amount of pot being grown on site. This one had, according to the police, dozens of computers mining cryptocurrency (and judging by the photos, those are server racks, so a largish operation).
You reminded me that some time ago I found an article about a big investigation in Japan into 56 million dollars stolen from a Bitcoin-related company by hackers...
I wonder how miners can spend REAL money to get a bunch of bytes that can be lost like a corrupted JPG image, or simply "copied" to another hard disk by anyone with the right knowledge, since the "money" is stored on a local drive.
That shortage scares me a lot. It means higher prices not only for GPUs but for anything built from silicon... like CPUs and motherboards...
All money only has value because folks believe it does.
Bitcoin is no different; if that belief falters, then the price goes down by a fair bit - and potentially collapses if enough folks lose faith.
I bought a refurbed workstation off ebay with 96 GB RAM in it just for large scenes. Honestly... You can get into a box like it for less than a 1080 ti. I run Iray on it when I have a scene that won't fit into my TITAN X's.
Just hit ebay and search for 96 GB ram under Computer. I like the HP Z800 and Lenovo D30.
Well, by that argument, maybe national economies will have a breakdown and regular money will lose value, the world market sustained only by bitcoin...
Greetings,
So...I want to address a few things, but it's likely to be boring as heck for folks, so feel free to skip.
I seem to be saying this all too often recently (for almost everything in life), but it's not that simple. First off, they typically hold their BTC money in exchanges, which allows them to convert it between USD and BTC at will. If you mine some BTC, you don't typically keep it locally; you transfer it to the rough equivalent of a 'bank'. The main difference is that it's just a string of bits and a set of encryption keys, not a name, social security number, address, etc., etc... You CAN keep it locally, but then you encrypt it, back it up, and keep the encryption key offline, and maybe even two-factored (something you have and something you know), so that if someone copies your bits, they can't access the currency to transfer it away (i.e. 'take' it).
It's not hard to spend 'real' money ('fiat currency'; it has value because a government says it has value) when you can arbitrarily convert between BTC and fiat currency. If your mining rigs have generated 1BTC, you can get an exchange to convert that into a few thousand USD easily, and (1) pay your electric bill, and (2) buy four top-of-the-line GPUs, and still have money left over to fly to Disneyland and let your rig make money for you (literally) while you enjoy yourself.
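A back-of-the-envelope version of that argument, where every number is a placeholder assumption rather than a real quote:

```python
# All figures below are assumed for illustration, not actual rates.
btc_mined = 1.0
usd_per_btc = 10_000   # assumed exchange rate at conversion time
electric_bill = 500    # assumed monthly power cost, USD
gpu_price = 700        # assumed price of one top-of-the-line card, USD

proceeds = btc_mined * usd_per_btc
leftover = proceeds - electric_bill - 4 * gpu_price
print(f"Left over after the bill and 4 GPUs: ${leftover:,.0f}")  # $6,700
```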
The thing is that folks 'mining' BTC actually help BTC, because the work their computers are doing isn't just random work; it's signing (cryptographically) the last set of transactions (payments between BTC accounts) that happened. That's what all those GPUs are doing out there: they're agreeing across the world that a set of transactions happened, and signing them. In exchange, they get 'paid' in BTC for the proof-of-work they're doing.
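For the curious, here's a toy Python sketch of what that proof-of-work looks like. Real Bitcoin double-SHA-256 hashes a block header against an adjustable difficulty target, so treat this as a simplification of the idea, not the actual protocol:

```python
import hashlib

def mine(transactions: str, difficulty: int = 4) -> int:
    """Find a nonce such that sha256(transactions + nonce) starts with
    'difficulty' hex zeros -- brute-force to find, trivial to verify."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{transactions}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Hypothetical batch of transactions being 'agreed on'.
print("proof-of-work nonce:", mine("alice->bob:0.5; bob->carol:0.2"))
```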
Yep. The big question in the VERY early days was, 'Will people actually start accepting this as payment?' The answer was a shockingly strong, 'Yes!' I can pay my hosting bill in BTC, for example. Yes, there's a lot of folks who use it as a digital cash substitute for all the less-than-savory things that cash is usually used for. But there's a lot MORE people who use it for everyday things because it's...like having cash, but on the internet. It's the Science Fiction-y 'credits', pretty much literally.
I sound a lot more rah-rah than I am; I'm terribly aware of just how...fragile the whole thing is. The algorithm is (pardon my language) F**KING BRILLIANT, and how everything...hangs together and reinforces itself is absolutely AMAZING. Imagine the most beautiful artwork you've ever seen; the original BTC paper and implementation is the equivalent for computer science, and the author is (!) anonymous. The dangerous part is the actual mining of BTC (and it's called out in the abstract for the paper). If mining DOES crash, then the distributed number of folks signing transactions will drop, which puts more power in the hands of the super-large BTC mining organizations, which could cooperate to attack the chain. This could lead to a lack of faith in the transaction history (the 'blockchain'), which would lead to a loss of faith in BTC itself, which...as we've described...means that the currency will lose value.
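As a minimal sketch of why it all 'hangs together': each block's hash commits to the previous block's hash, so rewriting old history invalidates everything after it unless all the later work is redone. Toy Python, not Bitcoin's actual data structures:

```python
import hashlib

def block_hash(prev_hash: str, transactions: str) -> str:
    """Each block's hash covers the previous block's hash."""
    return hashlib.sha256(f"{prev_hash}|{transactions}".encode()).hexdigest()

# Build a tiny chain of three made-up blocks.
chain, prev = [], "0" * 64  # all-zero 'genesis' hash
for txs in ["a->b:1", "b->c:2", "c->a:0.5"]:
    prev = block_hash(prev, txs)
    chain.append((txs, prev))

# Tampering with block 0 changes its hash, which no longer matches what
# block 1 committed to -- any honest node can detect the rewrite.
tampered = block_hash("0" * 64, "a->b:100")
print(tampered == chain[0][1])  # False
```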
I'm not invested in BTC at all, I just stand in flat awe of the intellect it took to come up with it all, and have it work at global scale, first time out.
-- Morgan
Not true; real-world currencies are based on a state's gold reserves. For example, the state of Zumpabalubba (it doesn't exist for real! 0.0) owns 10 million dollars' worth of gold in its deposits, so it can issue 10 million Zumpabalubbian Dollars. The loss of "power" of the currency depends on many factors, but the state still has those 10 million in gold bars to cover the money's value. Someone told me it works this way.
Bitcoins are based on thin air. What pricey mineral backs their value? Silicon, maybe?
Anyway, even though I don't want miners to become beggars, I hope this bad situation ends soon. My workstation is giving up; my GPU is begging for mercy...
That's the way to do it! I gradually bought surplus parts to build several workstations similar to the ones you mentioned. Besides the vendors on Ebay, there are a couple that sell through Newegg too (might be the same ones). When I checked, they would custom build/refurbish, and supply a warranty, for less than the current cost for parts or, as you said, a 1080ti or even a 1080. Now even the XEON E5-2680V2 10-core cpus are pretty cheap, the E5-2670/2680 V1 8-cores are dropping in price, and the older X56xx dual six-core CPUs and MBs are going for literally pennies on the dollar. I planned these so that I could have enough crunch power to run VUE render and Carrara/Grid but they will do fine for IRAY too. Now I just gotta get them built....
Somebody told you wrong. Money is not based on a country's gold (it sort of used to be, a long time ago). That said, real currencies tend to be much more stable than cryptocurrencies because they have the stability of a government saying "we think this is worth stuff and support it." As long as the government remains stable and doesn't start printing money out the wazoo, the currency will remain relatively stable. Cryptocurrencies don't have that same stable base and so are much more prone to fluctuations in value.
Sidenote on gold: its value isn't really all that intrinsic either, as its basis is a mix of scarcity and people liking shiny things (also some tech applications, but those are a recent development in the grand scheme of things), which is why I find the subset of people who want to go back to the gold standard so weird. Also weird: the political overlap between people who like the gold standard and people who like cryptocurrencies. (I mean, I do actually get it; I just find it funny.)
One big reason people got away from the gold standard is that it set up a situation where a few wealthy bankers could manipulate value very easily by sitting on assets.
Mind you, they can STILL do that, but the problem was particularly egregious before.
One of the appealing things about Bitcoin is the idea of a distributed responsibility. Instead of governments directly managing the value of some currency, it's more diffuse.
Which has its upsides and downsides; while I'm a bit iffy on Bitcoin myself, I'm growing to think a lot of the dismissal of Bitcoin comes from people not really facing how baloney our other money systems are, too. ;)
Money hasn't been based on gold (at least in the US) for decades.
Laurie
And I'm fine with that. I grew up with a Commodore 64 and I'm sure there are some folks here who would call me a young whippersnapper. We've been through tough times before and we'll get through this. People can switch to their cell phones and Netflix and VR goggles or whatever, but to make all that stuff work, somebody has to program something on a PC. So as long as companies are making PCs for industrial and engineering use, they can sell to the home market too, it just won't be their main priority anymore.
When PCs were mainstream, we gamers, hobbyists and enthusiasts could walk the streets like normal humans. The 'normals' even learned our language, using words like PCI, baud or CPU, often incorrectly, but they made the effort. We no longer had to endure questions like "You have a computer? What for?" Now that winter has fallen upon us, we may have to retreat back to our attics and basements, but we'll always remember that short decade in the sun, when nerds were accepted, and dare I say admired.
Ok maybe that's pushing it.
Greetings,
^^^ This.
The truth is that if the US government (or other large governments) wanted to destabilize BTC, they could, through the application of massive amounts of computing. Mind you, it would take the equivalent of two large nuclear power plants to provide power for it, but it's absolutely doable, and it would take less than a year to destabilize the currency sufficiently.
The comment about gold is why I specifically call out what we think of as 'real' currencies as 'fiat' currencies; they exist, and have value, because the government backing them says they have value, and people have faith in the government. Not just you and I, but financiers across the world. They invest in that currency (Yuan, Yen, Dollar, Euro, etc.) because they trust the governments behind them not to print arbitrarily large amounts of it and to pay their bills.
This is far afield of the problem of nVidia GPUs not being available. In the end if a cryptographic solution is strong enough and easy enough to explain and implement, and remains secure, it gets the benefit of that same 'faith' that it's not going to print arbitrarily large quantities (because the amount available is limited algorithmically), and that it's not going to go away (because transactions are being done in it). As long as that faith is in place, we've built an international network that makes transacting in digital currencies very, very easy.
...and thus, unfortunately, I don't see cryptocurrencies going away any time soon, and equivalently the hunger for GPUs. I believe the ultimate outcome will be that nVidia will ramp up production for a while, until a more tailored solution becomes available, at which point they'll gratefully settle back into being a producer for gaming cards.
...
It's worth noting, though, that rendering is a process very similar to calculating hashes, and any hardware that gets better at cryptomining _could_ in theory be repurposed to improve rendering. This might be a 'darkest before the dawn' thing. If there's a boom in massively parallel compute units in order to sate the cryptocurrency miners, it's possible it could throw off other interesting technologies that we can use for rendering.
-- Morgan
...exactly, a mining rig in a convenient box.
It would also be prohibitively expensive for most folks. If it were built on the same platform as the current VCA (which has 8 Quadro P6000s), using 1080 Tis instead, even at the original base cost per card you are looking at $5,600 just for the cards alone. You would still need a CPU, some memory, and the interconnects as well. It also would give you only 11 GB of VRAM for rendering since, as I mentioned, VRAM does not pool for that process, so the only advantage is that you have 28,672 cores to speed the process up or batch-process multiple render jobs at the same time.
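Checking that arithmetic (assuming the 1080 Ti's roughly $700 launch price and its 3,584 CUDA cores per card, which matches the figures above):

```python
cards = 8
price_per_card = 700    # approximate 1080 Ti launch price, USD
cores_per_card = 3_584  # CUDA cores on a 1080 Ti

print(cards * price_per_card)  # 5600  -> cards alone
print(cards * cores_per_card)  # 28672 -> total CUDA cores
```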
...
...I've even found 128 GB dual-Xeon workstation servers with old Quadro cards (basically with enough memory to run the displays) for about what I paid to build my system.