OT Update 2: Nvidia & AMD about to lose a lot of sales from cryptominers?

Comments

  • Rottenham said:
    Taoz said:

    I'm not sure how VRAM works on a video card, but if there are a lot of constant read/write operations, an SSD wouldn't last long. Besides, VRAM is a lot faster than an SSD, and they're making it faster and faster, which there must be a reason for. But maybe that's only relevant for gaming, and not still renders.

    This article says, "VRAM is a special type of DRAM which is dual-ported. It still has a duty cycle, but it can write to and read from at the same time. In practice, this means that you get double the bandwidth out of 60 ns VRAM as you would out of 60 ns DRAM (if implemented correctly on the video card)."

    That said, it is just a little chip. It could be soldered on an M.2 card (or some such) and plugged into anything - a video card, for example. The video driver would never know the difference.

    I can't believe this hasn't been thought of and discussed at AMD and Nvidia. My guess is, the profits are much greater in selling you an entire replacement video card than allowing you to upgrade it. Soldering components in rather than socketing them is always cheaper to begin with. Why mess with success? They can't make the stuff they have now fast enough.
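    The "double the bandwidth" claim in the article quote above is easy to sanity-check with rough arithmetic. This is only a sketch: the 60 ns figure comes from the quote, while the 32-bit bus width is an illustrative assumption.

```python
# Rough effective-bandwidth comparison for dual-ported VRAM vs. single-ported
# DRAM, per the quoted claim: same 60 ns cycle, but VRAM can service a read
# and a write in the same cycle. Bus width is an illustrative assumption.
CYCLE_NS = 60        # access time from the quoted article
BUS_BYTES = 4        # assume a 32-bit data path

accesses_per_sec = 1e9 / CYCLE_NS          # one access per cycle
dram_bw = accesses_per_sec * BUS_BYTES     # bytes/sec: read OR write
vram_bw = 2 * dram_bw                      # dual-ported: read AND write

print(f"DRAM: {dram_bw / 1e6:.1f} MB/s, VRAM: {vram_bw / 1e6:.1f} MB/s")
```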

    If there's no supply, there's not much profit. Trickle-up economics works just fine. Upgrade paths are a viable alternative for more people than the direct-upgrade path. Nvidia targets two specific markets: gamers with the GTX and brainiacs with the Quadro. Us wee folk on the rendered fringe using low-cost solutions are not "big players" in their ledgers. With the mining boom, they're exploring that market. Daz Studio users are not on their radar because there are a thousand hobbyists for every "getting paid to do it" artist. As well, a lot of us are also gamers, so we're in that market. Those few who are looking at building rendering rigs slowly and with as little lump-sum expense as possible are not paying their monthly bills by making huge purchases at once, but many streams combine to form a mighty river.

    I still say their best move, and one that's easy for them to do, is build a GTX-based VCA box. It doesn't need 256GB of ECC RAM. Leave it user-configurable for CPU and standard RAM (single CPU or double, Xeon or i7, or whatever), and it doesn't have to run from a server. They can even cut it down so it only takes 4 GPUs, so long as you can add several to a single system. It'd be a lot cleaner than the Amfeltec GPU cluster, which is an electrical, fire, and trip hazard all at once.
    Not to mention it'd be in a snazzy rackmount box.

  • kyoto kid said:
    Imago said:
    Kitsumo said:

    Yep, someone covered it earlier https://www.daz3d.com/forums/discussion/comment/3396241/#Comment_3396241 . Renders take a little longer to start (copying textures across a PCI-E x1 bus instead of x8 or x16) but after that it rendered normally. I'm going to try it once I get a new card. The only problem is finding someplace to mount your cards.

    Ah, I see!

    It looks like the increase isn't that much... Well, in the end it was really a "download more RAM" thing!

    Anyway, I hope GPU prices will drop soon. I don't want the miners to go bankrupt due to a sudden collapse of cryptocurrency... I just want to buy a piece of technology at its right price!

    If by "download more RAM" you mean "more GPUs means more VRAM", such is not the case, sadly. As I said, the Titan Z was two 6GB GPUs on one board, advertised as 12GB. Where Iray is concerned, it's two 6GB cards. The 2880x2 cores stack, but Iray only sees 6GB of VRAM. Even with a Titan X Pascal, a 1080ti, a Titan Z, a 980 (4GB), and 2 780ti (3GB), if my scene goes over 3GB, it dumps to the CPU.
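    A toy model of the limit described above, assuming (as the post says) that Iray cannot pool VRAM across cards, so each GPU must hold the whole scene in its own memory. The helper function is hypothetical; card names and sizes mirror the post.

```python
# Toy model of the per-card VRAM limit described above: Iray cannot pool
# memory across GPUs, so the whole scene must fit in EACH card's own VRAM
# for that card to participate. Card names and sizes mirror the post
# (the Titan Z counts as two 6 GB GPUs, not one 12 GB GPU).
def gpus_that_can_render(scene_gb, gpus):
    """Return the names of cards whose individual VRAM holds the scene."""
    return [name for name, vram_gb in gpus if scene_gb <= vram_gb]

rig = [("Titan X Pascal", 12), ("1080 Ti", 11), ("Titan Z (per GPU)", 6),
       ("980", 4), ("780 Ti #1", 3), ("780 Ti #2", 3)]

print(gpus_that_can_render(2.5, rig))  # all six GPUs participate
print(gpus_that_can_render(3.5, rig))  # both 3 GB cards drop out
```

    (The post reports the render dropping to CPU once the smallest card's 3 GB is exceeded; this sketch just reports which cards could still hold the scene.)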

    If Nvidia wanted to make a "targeted market" GPU series, it should be a GTX-priced line that had the capability to stack VRAM. No outputs, just processing, and a more flexible way to join them than a hard plastic SLI connector. They could even go so far as to make a GTX-based VCA in 4, 6, or 8 GPU configurations that could be daisy-chained for future expandability, and that would report to the OS as a single GPU to get around Windows' GPU count limit (without having to disable resources in Device Mismanager).

    If they really wanted to get ahead of the pack, they'd drop embedded VRAM entirely and use user-swappable SSDs, so long as their transfer rate was comparable. You could buy a 2GB card to save a few bucks up front, then attach a couple of 6GB SSDs to add 12GB more at a later time. This takes the pressure off them to keep the market supply up, and cuts out the headache and heartache of end-users who scrimp and save for a better card only to find the price has gone up another $200+ by the time they're ready to buy.

    It'd be nice if AMD cards were suddenly way better and cheaper for mining than Nvidia cards :P

     

    ...not sure if it is possible to pool VRAM for rendering (doesn't even work with Quadros).  It can be done through NVLink for the latest Volta Tesla cards but that is strictly for compute operations only.

    I agree with Taoz about using SSDs for such heavy duty read/write operations.

    As to VRAM, I believe the next-generation (20xx) cards will have GDDR6, from what I have read.  HBM2 is still more expensive than conventional VRAM, so it may only be seen on the upper-end Quadro cards. (The Volta GV-100 has 16 GB of HBM2 but costs around $6,500 to $7,000.)

    If it requires heavy read/write operations to display while rendering, that can be solved by making Tesla-style render-only cards that dump out large buckets to the display card, rather than trying to do both at once. If the entire scene is dumped into the card's proposed SSDVRAM, then it should be a write-all-to-buffer-read-all-from-buffer process. Realtime rendering would not be an option, but I'll take limitless user-configurable headroom with timed updates over RT all day long.

  • nicstt Posts: 11,715
    kyoto kid said:

    Current rumors are that Nvidia is going to cash in on the market prices with much higher prices on the 2000 series. The flagship successor to the 1080 Ti could be well over $1,000, even $1,500.

    https://wccftech.com/rumor-nvidias-gtx-2080-flagship-graphics-cards-will-be-priced-significantly-upwards-of-699-msr-up-to-1500/

    But nobody really knows what Nvidia is doing. I'm not sure Nvidia really knows what they are doing, LOL. They were originally going to have the new cards out in the next month or so, but now the speculation is later in the year. Pascal is supposedly at the end of its life, meaning production is supposed to have stopped. But then how are they getting new stock out?

    My guess is, with 1080s going so wild, it would be easy to charge $700 or more MSRP for a 2080. But this would be a huge risk, too. If the mining boom were to suddenly die around the launch period, prices would bottom out and the high price on the 2080 would backfire.

    ...$1,500?  Crikey, almost better off getting a P5000 and having 5 extra GB along with the ability to sidestep W10 WDDM even if it has fewer cores.

    Yet again, with Iray, Core Count outranks VRAM, unless you're building 10+ GB scenes. A Titan Z will do a lot as long as you keep it under 6GB total. Unless the new one has been released, it's still the highest-core double-decker available. Even if the new double-Titan has been released, the Titan Z is still cheaper.

     

    Not really, presuming you're talking about IRAY.

    Both have equal importance; why? 'Cause masses of CUDA cores paired with scenes that drop to CPU add up to a very expensive paperweight. Of course, eventually scenes can get too large for the RAM, but the more RAM, the less hassle getting the scene to fit.

    First determine what the majority of your scenes are, RAM-wise; once this is known, look for cards that have at least that amount. Pick the one with the most CUDA cores, provided you can afford it or are prepared to pay the cost; otherwise look at cheaper cards that fulfill your needs.

    You could, as an alternative, look for the most CUDA cores, then decide if you can afford it; if you can, does it have the RAM required? If both answers are yes, then buy.

    Either method works.
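    The two selection methods above can be sketched as follows; the card list, prices, and budget are hypothetical placeholders, not current market data.

```python
# A sketch of the two card-selection methods described above. Card data
# and the budget figure are hypothetical; prices and VRAM vary with the market.
def pick_by_ram_then_cuda(cards, needed_gb, budget):
    """Method 1: filter to cards with enough VRAM that you can afford,
    then take the one with the most CUDA cores."""
    fits = [c for c in cards if c["vram_gb"] >= needed_gb and c["price"] <= budget]
    return max(fits, key=lambda c: c["cuda"], default=None)

def pick_by_cuda_then_ram(cards, needed_gb, budget):
    """Method 2: rank by CUDA cores, then take the first affordable card
    that also has the VRAM required."""
    for c in sorted(cards, key=lambda c: c["cuda"], reverse=True):
        if c["price"] <= budget and c["vram_gb"] >= needed_gb:
            return c
    return None

cards = [
    {"name": "1080 Ti", "cuda": 3584, "vram_gb": 11, "price": 1100},
    {"name": "1080",    "cuda": 2560, "vram_gb": 8,  "price": 800},
    {"name": "1070",    "cuda": 1920, "vram_gb": 8,  "price": 550},
]

# Both methods agree here, as the post says ("either method works").
print(pick_by_ram_then_cuda(cards, needed_gb=8, budget=900)["name"])  # 1080
print(pick_by_cuda_then_ram(cards, needed_gb=8, budget=900)["name"])  # 1080
```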

  • hphoenix Posts: 1,335
    Rottenham said:
    Taoz said:

    I'm not sure how VRAM works on a video card, but if there are a lot of constant read/write operations, an SSD wouldn't last long. Besides, VRAM is a lot faster than an SSD, and they're making it faster and faster, which there must be a reason for. But maybe that's only relevant for gaming, and not still renders.

    This article says, "VRAM is a special type of DRAM which is dual-ported. It still has a duty cycle, but it can write to and read from at the same time. In practice, this means that you get double the bandwidth out of 60 ns VRAM as you would out of 60 ns DRAM (if implemented correctly on the video card)."

    That said, it is just a little chip. It could be soldered on an M.2 card (or some such) and plugged into anything - a video card, for example. The video driver would never know the difference.

    I can't believe this hasn't been thought of and discussed at AMD and Nvidia. My guess is, the profits are much greater in selling you an entire replacement video card than allowing you to upgrade it. Soldering components in rather than socketing them is always cheaper to begin with. Why mess with success? They can't make the stuff they have now fast enough.

    Back in the older days of video cards, they DID have boards with sockets on them to upgrade the memory.  The problem has to do with modern cards.  First, the distance involved in having mechanical sockets introduces latency that memory sitting at most 1 cm of trace from the GPU itself doesn't have.  Second, at current speeds, the memory on a video card has to be actively cooled as well.  This means even more space between and around socketed memory sticks, meaning even more latency.  That means a slower card.

    nVidia and ATI/AMD both were forced to go to a strict on-board memory model in order to get the speeds/cooling to stay competitive.  When a gamer wants to wring the highest fps for his money out of a card, he'll pay for a non-upgradable card with faster performance.

    The old ATI Rage16 (IIRC) and other contemporary cards often had memory sockets for additional upgrades.  Back then, video cards didn't HAVE fans, and only the GPU on the higher-speed models needed a passive heat sink.
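    The latency argument can be put in rough numbers. Signal propagation on FR-4 PCB is on the order of 15 cm/ns (about half the speed of light); that figure, the trace lengths, and the ~8 Gb/s per-pin data rate below are ballpark assumptions, not datasheet values.

```python
# Back-of-the-envelope numbers for the socketed-vs-soldered latency argument.
# Signals on FR-4 PCB travel at very roughly 15 cm/ns; the trace lengths
# and the ~8 Gb/s per-pin rate are ballpark assumptions, not datasheet values.
PROP_CM_PER_NS = 15.0

def flight_time_ns(trace_cm):
    """One-way signal flight time over a PCB trace of the given length."""
    return trace_cm / PROP_CM_PER_NS

onboard = flight_time_ns(1)    # memory soldered ~1 cm from the GPU
socketed = flight_time_ns(8)   # a socketed module several cm away

bit_time_ns = 1 / 8.0          # one bit time at ~8 Gb/s per pin
extra_bits = (socketed - onboard) / bit_time_ns
print(f"extra one-way delay: {socketed - onboard:.3f} ns "
      f"= {extra_bits:.1f} bit times on the wire")
```

    Half a nanosecond sounds tiny, but at GDDR-era data rates it spans several bit times, which is why trace length (and length matching) gets so unforgiving on modern cards.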

  • laststand6522732 Posts: 866
    edited March 2018

    ~

    Post edited by laststand6522732 on
  • If there's no supply, there's not much profit. Trickle-up economics works just fine. Upgrade paths are a viable alternative for more people than the direct-upgrade path. Nvidia targets two specific markets: gamers with the GTX and brainiacs with the Quadro. Us wee folk on the rendered fringe using low-cost solutions are not "big players" in their ledgers. With the mining boom, they're exploring that market. Daz Studio users are not on their radar because there are a thousand hobbyists for every "getting paid to do it" artist. As well, a lot of us are also gamers, so we're in that market. Those few who are looking at building rendering rigs slowly and with as little lump-sum expense as possible are not paying their monthly bills by making huge purchases at once, but many streams combine to form a mighty river.

    You said a mouthful, didn't you? Agreed, amateur animators are not much of a target market, and yes, there is an artist on every street corner, as has always been the case. I do acknowledge that "gaming" (I don't think the name tells the whole story) is the primary thing keeping the desktop PC alive, and I am glad something is.  It's worth keeping in mind, though, that an awful lot of product decisions today are made not by futurists or technologists or even by insightful users, but by myopic individuals with little or no technical knowledge or understanding.

     

  • Rottenham said:
    Taoz said:

    I'm not sure how VRAM works on a video card, but if there are a lot of constant read/write operations, an SSD wouldn't last long. Besides, VRAM is a lot faster than an SSD, and they're making it faster and faster, which there must be a reason for. But maybe that's only relevant for gaming, and not still renders.

    This article says, "VRAM is a special type of DRAM which is dual-ported. It still has a duty cycle, but it can write to and read from at the same time. In practice, this means that you get double the bandwidth out of 60 ns VRAM as you would out of 60 ns DRAM (if implemented correctly on the video card)."

    That said, it is just a little chip. It could be soldered on an M.2 card (or some such) and plugged into anything - a video card, for example. The video driver would never know the difference.

    I can't believe this hasn't been thought of and discussed at AMD and Nvidia. My guess is, the profits are much greater in selling you an entire replacement video card than allowing you to upgrade it. Soldering components in rather than socketing them is always cheaper to begin with. Why mess with success? They can't make the stuff they have now fast enough.

    If there's no supply, there's not much profit. Trickle-up economics works just fine. Upgrade paths are a viable alternative for more people than the direct-upgrade path. Nvidia targets two specific markets: gamers with the GTX and brainiacs with the Quadro. Us wee folk on the rendered fringe using low-cost solutions are not "big players" in their ledgers. With the mining boom, they're exploring that market. Daz Studio users are not on their radar because there are a thousand hobbyists for every "getting paid to do it" artist. As well, a lot of us are also gamers, so we're in that market. Those few who are looking at building rendering rigs slowly and with as little lump-sum expense as possible are not paying their monthly bills by making huge purchases at once, but many streams combine to form a mighty river.

    I still say their best move, and one that's easy for them to do, is build a GTX-based VCA box. It doesn't need 256GB of ECC RAM. Leave it user-configurable for CPU and standard RAM (single CPU or double, Xeon or i7, or whatever), and it doesn't have to run from a server. They can even cut it down so it only takes 4 GPUs, so long as you can add several to a single system. It'd be a lot cleaner than the Amfeltec GPU cluster, which is an electrical, fire, and trip hazard all at once.
    Not to mention it'd be in a snazzy rackmount box.

    The problem with this is the same as with the current situation; while it might be targeted at the artistic folks, if it can be made to mine, you can bet your next purchase cost here that they would snap them up like they are doing with video cards now.
  • Kitsumo Posts: 1,221
    Rottenham said:

    If there's no supply, there's not much profit. Trickle-up economics works just fine. Upgrade paths are a viable alternative for more people than the direct-upgrade path. Nvidia targets two specific markets: gamers with the GTX and brainiacs with the Quadro. Us wee folk on the rendered fringe using low-cost solutions are not "big players" in their ledgers. With the mining boom, they're exploring that market. Daz Studio users are not on their radar because there are a thousand hobbyists for every "getting paid to do it" artist. As well, a lot of us are also gamers, so we're in that market. Those few who are looking at building rendering rigs slowly and with as little lump-sum expense as possible are not paying their monthly bills by making huge purchases at once, but many streams combine to form a mighty river.

    You said a mouthful, didn't you? Agreed, amateur animators are not much of a target market, and yes, there is an artist on every street corner, as has always been the case. I do acknowledge that "gaming" (I don't think the name tells the whole story) is the primary thing keeping the desktop PC alive, and I am glad something is.  It's worth keeping in mind, though, that an awful lot of product decisions today are made not by futurists or technologists or even by insightful users, but by myopic individuals with little or no technical knowledge or understanding.

     

    I've never understood all those people who predicted the 'death' of the PC. I guess they're just trying to sell newspapers or web page clicks or whatever. In most cases, anyone who does any real work is using a PC. I mean, you probably could code a web page or prepare someone's taxes using an iPad, but I don't think it's optimal. Between office workers, scientists, the tech industry, and hobbyists (including gamers), the PC is safe from extinction. Sure, the average person may not own a computer anymore, but for me that just means I don't get so many calls from family and friends asking me to fix their computers.

  • laststand6522732 Posts: 866
    edited March 2018

    ~

    Post edited by laststand6522732 on
  • Kitsumo said:

    I've never understood all those people who predicted the 'death' of the PC. I guess they're just trying to sell newspapers or web page clicks or whatever. In most cases, anyone who does any real work is using a PC. I mean, you probably could code a web page or prepare someone's taxes using an iPad, but I don't think it's optimal. Between office workers, scientists, the tech industry, and hobbyists (including gamers), the PC is safe from extinction. Sure, the average person may not own a computer anymore, but for me that just means I don't get so many calls from family and friends asking me to fix their computers.

    Agreed that the so-called pundits often stand too close to the spreadsheet to see reality. However, the growth of remote servers continues, and a desktop PC doesn't fit in your pocket. The PC as we know it could become a niche market. If it isn't already.

  • ghastlycomic Posts: 2,531

    I was in a computer store the other day buying an SD card for my camcorder and there was a guy in front of me with a dozen Nvidia graphics cards. When he left I asked the cashier "bitcoin mining" and he said, "yeah, he comes in every week and buys every video card we've got and then bitches at us because we don't have more".

    I think as long as that's happening you're not going to see much relief in the price of graphics cards.

  • Greymom Posts: 1,140

    I was in a computer store the other day buying an SD card for my camcorder and there was a guy in front of me with a dozen Nvidia graphics cards. When he left I asked the cashier "bitcoin mining" and he said, "yeah, he comes in every week and buys every video card we've got and then bitches at us because we don't have more".

    I think as long as that's happening you're not going to see much relief in the price of graphics cards.

    Some bars still have signs "We don't serve minors."   Maybe it's time for computer stores to put up "We don't serve cryptominers".

  • tj_1ca9500b Posts: 2,057

    OK, so GPUs are getting snatched up almost immediately, reportedly there's a memory supply shortage too, and now there's also a wafer shortage?!?

    http://www.guru3d.com/news-story/wafer-shortage-now-also-pushes-prices-upwards.html

    We can't seem to catch a break guys!

  • Nath Posts: 2,942

    On a side note, I just saw on a Dutch news site that the police arrested someone for laundering money and illegally drawing off electricity. Usually in these cases they find a large amount of pot being grown on site. This one had, according to the police, tens of computers mining cryptocurrency (and looking at the photos, those are server racks, so a large-ish operation).

  • Imago Posts: 5,666
    Nath said:

    On a side note, I just saw on a Dutch news site that the police arrested someone for laundering money and illegally drawing off electricity. Usually in these cases they find a large amount of pot being grown on site. This one had, according to the police, tens of computers mining cryptocurrency (and looking at the photos, those are server racks, so a large-ish operation).

    You reminded me that some time ago I found an article about a big investigation in Japan into $56 million stolen from a Bitcoin-related company by hackers...

    I wonder how miners can spend REAL money to get a bunch of bytes that can be lost like a corrupted JPG image, or simply "copied" to another hard disk by anyone with the right knowledge, since the "money" is stored on a local drive.

     

    OK, so GPUs are getting snatched up almost immediately, reportedly there's a memory supply shortage too, and now there's also a wafer shortage?!?

    http://www.guru3d.com/news-story/wafer-shortage-now-also-pushes-prices-upwards.html

    We can't seem to catch a break guys!

    That shortage scares me a lot. It means not only higher prices for GPUs but also for anything that needs silicon to be built... like CPUs and motherboards...

  • nicstt Posts: 11,715
    Imago said:
    Nath said:

    On a side note, I just saw on a Dutch news site that the police arrested someone for laundering money and illegally drawing off electricity. Usually in these cases they find a large amount of pot being grown on site. This one had, according to the police, tens of computers mining cryptocurrency (and looking at the photos, those are server racks, so a large-ish operation).

    You reminded me that some time ago I found an article about a big investigation in Japan into $56 million stolen from a Bitcoin-related company by hackers...

    I wonder how miners can spend REAL money to get a bunch of bytes that can be lost like a corrupted JPG image, or simply "copied" to another hard disk by anyone with the right knowledge, since the "money" is stored on a local drive.

    All money only has value because folks believe it does.

    Bitcoin is no different; if that belief falters, then the price goes down by a fair bit - and potentially collapses if enough folks lose faith.

  • Gazukull Posts: 96

    I bought a refurbed workstation off ebay with 96 GB RAM in it just for large scenes.  Honestly... You can get into a box like it for less than a 1080 ti.  I run Iray on it when I have a scene that won't fit into my TITAN X's.  

    Just hit ebay and search for 96 GB ram under Computer.  I like the HP Z800 and Lenovo D30.  

  • Oso3D Posts: 15,085
    nicstt said:

    All money only has value because folks believe it does.

    Bitcoin is no different; if that belief falters, then the price goes down by a fair bit - and potentially collapses if enough folks lose faith.

    Well, by that argument, maybe national economies will have a breakdown and regular money will lose value, the world market sustained only by bitcoin...

     

  • CypherFOX Posts: 3,401

    Greetings,

    So...I want to address a few things, but it's likely to be boring as heck for folks, so feel free to skip.

    nicstt said:
    Imago said:

    I wonder how miners can spend REAL money to get a bunch of bytes that can be lost like a corrupted JPG image, or simply "copied" to another hard disk by anyone with the right knowledge, since the "money" is stored on a local drive.

    I seem to be saying this all-too-often recently (for almost everything in life), but it's not that simple.  First off, they typically hold their BTC money in exchanges, which allows them to convert it between USD and BTC at will.  If you mine some BTC, you don't typically keep it locally, you transfer it to the rough equivalent of a 'bank'.  The main difference is that it's just a string of bits and a set of encryption keys, not a name, social security number, address, etc., etc...  You CAN keep it locally, but then you encrypt it, back it up, and keep the encryption key offline, and maybe even two-factored (something you have and something you know), so that if someone copies your bits, they can't access the currency to transfer it away (i.e. 'take' it).

    It's not hard to spend 'real' money ('fiat currency'; it has value because a government says it has value) when you can arbitrarily convert between BTC and fiat currency.  If your mining rigs have generated 1BTC, you can get an exchange to convert that into a few thousand USD easily, and (1) pay your electric bill, and (2) buy four top-of-the-line GPUs, and still have money left over to fly to Disneyland and let your rig make money for you (literally) while you enjoy yourself.

    The thing is that folks 'mining' BTC actually helps BTC because the work that their computers are doing isn't just random work; it's signing (cryptographically) the last set of transactions (payments between BTC accounts) that happened.  That's what all those GPUs are doing out there, they're agreeing across the world that a set of transactions happened, and signing them.  In exchange, they get 'paid' for the proof-of-work they're doing in BTC.
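    The proof-of-work idea described above can be illustrated with a toy miner. This is a drastically simplified sketch (SHA-256 over a string plus a nonce), not Bitcoin's actual block format, difficulty rule, or reward scheme.

```python
# Toy proof-of-work miner illustrating the idea above: grind through nonces
# until the hash of (transaction batch + nonce) falls below a target.
import hashlib

def mine(transactions, difficulty_bits=16):
    """Find a nonce whose SHA-256 over the batch has the required number
    of leading zero bits."""
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{transactions}|{nonce}".encode()).hexdigest()
        if int(digest, 16) < target:
            return nonce, digest
        nonce += 1

txs = "alice->bob:0.5;bob->carol:0.2"
nonce, digest = mine(txs)

# Verification takes a single hash: cheap to check, expensive to produce.
assert int(hashlib.sha256(f"{txs}|{nonce}".encode()).hexdigest(), 16) < 2 ** 240
print(nonce, digest[:16])
```

    In real Bitcoin the difficulty self-adjusts so the worldwide pool of miners finds one block roughly every ten minutes, and the winner collects the BTC payment for the proof-of-work, as described above.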

    nicstt said:

    All money only has value because folks believe it does.

    Bitcoin is no different; if that belief falters, then the price goes down by a fair bit - and potentially collapses if enough folks lose faith.

    Yep.  The big question in the VERY early days was, 'Will people actually start accepting this as payment?'  The answer was a shockingly strong, 'Yes!'  I can pay my hosting bill in BTC, for example.  Yes, there's a lot of folks who use it as a digital cash substitute for all the less-than-savory things that cash is usually used for.  But there's a lot MORE people who use it for everyday things because it's...like having cash, but on the internet.  It's the Science Fiction-y 'credits', pretty much literally.

    I sound a lot more rah-rah than I am; I'm terribly aware of just how...fragile the whole thing is.  The algorithm is (pardon my language) F**KING BRILLIANT, and how everything...hangs together and reinforces itself is absolutely AMAZING.  Imagine the most beautiful artwork you've ever seen; the original BTC paper and implementation is the equivalent for computer science, and the author is (!) anonymous.  The dangerous part is the actual mining of BTC (and it's called out in the abstract for the paper).  If mining DOES crash, then the distributed number of folks signing transactions will drop, which puts more power in the hands of the super-large BTC mining organizations, which could cooperate to attack the chain.  This could lead to a lack of faith in the transaction history (the 'blockchain'), which would lead to a loss of faith in BTC itself, which...as we've described...means that the currency will lose value.

    I'm not invested in BTC at all, I just stand in flat awe of the intellect it took to come up with it all, and have it work at global scale, first time out.

    --  Morgan

     

  • Imago Posts: 5,666
    nicstt said:

    All money only has value because folks believe it does.

    Bitcoin is no different; if that belief falters, then the price goes down by a fair bit - and potentially collapses if enough folks lose faith.

    Not true, real-world currencies are based on a state's gold reserves. For example, the state of Zumpabalubba (it doesn't exist for real! 0.0) owns 10 million worth of gold in its deposits, so it can issue 10 million Zumpabalubbian Dollars. The loss of "power" of the currency depends on many factors, but the state still has those 10 million in gold bars to cover the money's value. Someone told me it works this way.

    Bitcoins are based on thin air. What pricey mineral covers their value? Silicon, maybe?

    Anyway, even if I don't want Miners to become beggars, I hope this bad situation will end soon. My workstation is giving up, my GPU is begging for mercy...

  • Greymom Posts: 1,140
    edited March 2018
    Gazukull said:

    I bought a refurbed workstation off ebay with 96 GB RAM in it just for large scenes.  Honestly... You can get into a box like it for less than a 1080 ti.  I run Iray on it when I have a scene that won't fit into my TITAN X's.  

    Just hit ebay and search for 96 GB ram under Computer.  I like the HP Z800 and Lenovo D30.  

    That's the way to do it!  I gradually bought surplus parts to build several workstations similar to the ones you mentioned.   Besides the vendors on Ebay, there are a couple that sell through Newegg too (might be the same ones). When I checked, they would custom build/refurbish, and supply a warranty, for less than the current cost for parts or, as you said, a 1080ti or even a 1080.   Now even the XEON E5-2680V2 10-core cpus are pretty cheap, the E5-2670/2680 V1 8-cores are dropping in price, and the older X56xx dual six-core CPUs and MBs are going for literally pennies on the dollar.  I planned these so that I could have enough crunch power to run VUE render and Carrara/Grid but they will do fine for IRAY too.  Now I just gotta get them built....

    Post edited by Greymom on
  • j cade Posts: 2,310
    Imago said:
    nicstt said:

    All money only has value because folks believe it does.

    Bitcoin is no different; if that belief falters, then the price goes down by a fair bit - and potentially collapses if enough folks lose faith.

    Not true, real-world currencies are based on a state's gold reserves. For example, the state of Zumpabalubba (it doesn't exist for real! 0.0) owns 10 million worth of gold in its deposits, so it can issue 10 million Zumpabalubbian Dollars. The loss of "power" of the currency depends on many factors, but the state still has those 10 million in gold bars to cover the money's value. Someone told me it works this way.

    Bitcoins are based on thin air. What pricey mineral covers their value? Silicon, maybe?

    Anyway, even if I don't want Miners to become beggars, I hope this bad situation will end soon. My workstation is giving up, my GPU is begging for mercy...

     

    Somebody told you wrong. Money is not based on a country's gold (it sort of used to be, a long time ago). That said, real currencies tend to be much more stable than cryptocurrencies because they have the stability of a government saying "we think this is worth stuff and support it". As long as the government remains stable and doesn't start printing money out the wazoo, the currency will remain relatively stable. Cryptocurrencies don't have that same stable base and so are much more prone to fluctuations in value.

     

    Sidenote on gold: its value isn't really all that intrinsic either, as its basis is a mix of scarcity and people liking shiny things (also some tech applications, but those are a recent development in the grand scheme of things), which is why I find the subset of people who want to go back to the gold standard so weird. Also weird: the political overlap of people who like the gold standard and people who like cryptocurrencies. (I mean, I do actually get it, I just find it funny.)

  • Oso3D Posts: 15,085

    One big reason people got away from gold standard is because it sets up a situation where a few wealthy bankers could manipulate value very easily by sitting on assets.

    Mind you, they can STILL do that, but the problem was particularly egregious before.

    One of the appealing things about Bitcoin is the idea of a distributed responsibility. Instead of governments directly managing the value of some currency, it's more diffuse.

    Which has its upsides and downsides; while I'm a bit iffy on Bitcoin myself, I'm growing to think a lot of the dismissal of Bitcoin comes from people not really facing how baloney our other money systems are, too. ;)

     

  • AllenArt Posts: 7,175

    Money hasn't been based on gold (at least in the US) for decades.

    Laurie

  • Kitsumo Posts: 1,221
    Rottenham said:
    Kitsumo said:

    I've never understood all those people who predicted the 'death' of the PC. I guess they're just trying to sell newspapers or web page clicks or whatever. In most cases, anyone who does any real work is using a PC. I mean, you probably could code a web page or prepare someone's taxes using an iPad, but I don't think it's optimal. Between office workers, scientists, the tech industry and hobbyists (including gamers), the PC is safe from extinction. Sure, the average person may not own a computer anymore, but for me that just means I don't get so many calls from family and friends asking me to fix their computers.

    Agreed that the so-called pundits often stand too close to the spreadsheet to see reality. However, the growth of remote servers continues, and a desktop PC doesn't fit in your pocket. The PC as we know it could become a niche market. If it isn't already.

    And I'm fine with that. I grew up with a Commodore 64 and I'm sure there are some folks here who would call me a young whippersnapper. We've been through tough times before and we'll get through this. People can switch to their cell phones and Netflix and VR goggles or whatever, but to make all that stuff work, somebody has to program something on a PC. So as long as companies are making PCs for industrial and engineering use, they can sell to the home market too, it just won't be their main priority anymore.

    When PCs were mainstream, we gamers, hobbyists and enthusiasts could walk the streets like normal humans. The 'normals' even learned our language, using words like PCI, baud or CPU, often incorrectly, but they made the effort. We no longer had to endure questions like "You have a computer? What for?" Now that winter has fallen upon us, we may have to retreat back to our attics and basements, but we'll always remember that short decade in the sun, when nerds were accepted, and dare I say admired. Okay, maybe that's pushing it.

  • j cade Posts: 2,310
    Oso3D said:

    One big reason people got away from gold standard is because it sets up a situation where a few wealthy bankers could manipulate value very easily by sitting on assets.

    Mind you, they can STILL do that, but the problem was particularly egregious before.

    One of the appealing things about Bitcoin is the idea of a distributed responsibility. Instead of governments directly managing the value of some currency, it's more diffuse.

    Which has its upsides and downsides; while I'm a bit iffy on Bitcoin myself, I'm growing to think a lot of the dismissal of Bitcoin comes from people not really facing how baloney our other money systems are, too. ;)

     

    The biggest problem with that distributed responsibility is that Bitcoin isn't necessarily all that distributed. It is *very* vulnerable to large mining operations controlling large numbers of bitcoins and manipulating the market (to what degree this is already happening is unclear), so rather than government control you just end up with (let's be generous and say a certain degree of) corporate control, which is less than ideal.
  • CypherFOX Posts: 3,401

    Greetings,

    Oso3D said:

    One of the appealing things about Bitcoin is the idea of a distributed responsibility. Instead of governments directly managing the value of some currency, it's more diffuse.

    Which has its upsides and downsides; while I'm a bit iffy on Bitcoin myself, I'm growing to think a lot of the dismissal of Bitcoin comes from people not really facing how baloney our other money systems are, too. ;)

    ^^^ This.

    The truth is that if the US government (or other large governments) wanted to destabilize BTC, they could, through the application of massive amounts of computing.  Mind you, it would take the equivalent of two large nuclear power plants to provide the power for it, but it's absolutely doable, and it would take less than a year to destabilize the currency sufficiently.

    The comment about gold is why I specifically call out what we think of as 'real' currencies as 'fiat' currencies; they exist, and have value, because the government backing them says they have value, and people have faith in that government.  Not just you and me, but financiers across the world.  They invest in a currency (yuan, yen, dollar, euro, etc.) because they trust the government behind it not to print arbitrarily large amounts of it and to pay its bills.

    This is far afield of the problem of nVidia GPUs not being available.  In the end, if a cryptographic solution is strong enough, easy enough to explain and implement, and remains secure, it gets the benefit of that same 'faith': faith that arbitrarily large quantities won't be printed (because the amount available is limited algorithmically), and that it's not going to go away (because transactions are being done in it).  As long as that faith is in place, we've built an international network that makes transacting in digital currencies very, very easy.

    ...and thus, unfortunately, I don't see cryptocurrencies going away any time soon, nor the hunger for GPUs that comes with them.  I believe the ultimate outcome will be that nVidia will ramp up production for a while, until a more tailored solution becomes available, at which point they'll gratefully settle back into being a producer of gaming cards.

    ...

    It's worth noting, though, that rendering is a process very similar to calculating hashes, and any hardware that gets better at cryptomining _could_ in theory be repurposed to improve rendering.  This might be a 'darkest before the dawn' thing.  If there's a boom in massively parallel compute units in order to sate the cryptocurrency miners, it's possible it could throw off other interesting technologies that we can use for rendering.
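    To make the comparison concrete, here's a toy sketch of the kind of work a miner does. This is a deliberately simplified, hypothetical proof-of-work loop (real Bitcoin mining hashes an 80-byte block header with double SHA-256 on specialized hardware), but it shows why the workload is embarrassingly parallel number-crunching, much like tracing rays:

    ```python
    import hashlib

    def toy_mine(data: bytes, difficulty: int) -> tuple[int, str]:
        """Find a nonce whose SHA-256 digest of data+nonce starts with
        `difficulty` hex zeros -- a simplified proof-of-work search."""
        target = "0" * difficulty
        nonce = 0
        while True:
            digest = hashlib.sha256(data + str(nonce).encode()).hexdigest()
            if digest.startswith(target):
                return nonce, digest
            nonce += 1

    # Each nonce can be tried independently, which is why GPUs (and
    # render farms) are good at this kind of work.
    nonce, digest = toy_mine(b"block header", 4)
    print(nonce, digest)
    ```

    Every candidate nonce is independent of every other, so the search splits cleanly across thousands of cores, which is exactly the property path tracing exploits too.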

    --  Morgan

     

  • kyoto kid Posts: 41,873
    Rottenham said:
    Taoz said:

    I'm not sure how VRAM works on a video card, but if there is a lot of constant read/write operations SSD wouldn't last long. Besides, VRAM is a lot faster than SSD, and they're making it faster and faster, which there must be a reason for. But maybe that's only relevant for gaming, and not still renders.

    This article says, "VRAM is a special type of DRAM which is dual-ported. It still has a duty cycle, but it can write to and read from at the same time. In practice, this means that you get double the bandwidth out of 60 ns VRAM as you would out of 60 ns DRAM (if implemented correctly on the video card)."

    That said, it is just a little chip. It could be soldered on an M.2 card (or some such) and plugged into anything - a video card, for example. The video driver would never know the difference.

    I can't believe this hasn't been thought of and discussed at AMD and Nvidia. My guess is, the profits are much greater in selling you an entire replacement video card than allowing you to upgrade it. Soldering components in rather than socketing them is always cheaper to begin with. Why mess with success? They can't make the stuff they have now fast enough.

    If there's no supply, there's not much profit. Trickle-up economics works just fine. Upgrade paths are a viable alternative for more people than the direct-upgrade path. Nvidia targets two specific markets: gamers with the GTX and brainiacs with the Quadro. Us wee folk on the rendered fringe using low-cost solutions are not "big players" in their ledgers. With the mining boom, they're exploring that market. Daz Studio users are not on their radar because there are a thousand hobbyists for every "getting paid to do it" artist. As well, a lot of us are also gamers, so we're in that market. Those few who are looking at building rendering rigs slowly and with as little lump-sum expense as possible are not paying their monthly bills by making huge purchases at once, but many streams combine to form a mighty river.

    I still say their best move, and one that's easy for them to do, is build a GTX-based VCA box. It doesn't need 256GB of ECC RAM. Leave it user-configurable for CPU and standard RAM (single CPU or double, Xeon or i7, or whatever), and it doesn't have to run from a server. They can even cut it down so it only takes 4 GPUs, so long as you can add several to a single system. It'd be a lot cleaner than the Amfeltec GPU cluster, which is an electrical, fire, and trip hazard all at once.
    Not to mention it'd be in a snazzy rackmount box.

     

    The problem with this is the same as with the current situation; while it might be targeted at the artistic folks, if it can be made to mine, you can bet your next purchase here that miners would snap them up just like they're doing with video cards now.

    ...exactly, a mining rig in a convenient box.

    It would also be prohibitively expensive for most folks.  If it were built on the same platform as the current VCA (which has eight Quadro P6000s) using 1080 Tis instead, even at the original base cost per card you are looking at $5,600 for the cards alone. You would still need a CPU, some memory, and the interconnects as well. It also would only give you 11 GB of VRAM for rendering since, as I mentioned, VRAM does not pool for that process, so the only advantage is that you have 28,672 cores to speed the process up or batch-process multiple render jobs at the same time.
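    A quick back-of-the-envelope check of those figures (assuming the GTX 1080 Ti's roughly $700 launch price and its 3,584 CUDA cores per card):

    ```python
    cards = 8             # same card count as the Quadro P6000 VCA
    price_per_card = 700  # approximate 1080 Ti launch price, USD
    cores_per_card = 3584 # CUDA cores on a GTX 1080 Ti

    total_cost = cards * price_per_card   # cards alone, before CPU/RAM/interconnects
    total_cores = cards * cores_per_card  # parallel cores across the box

    print(f"${total_cost:,} for the cards, {total_cores:,} CUDA cores total")
    ```

    which matches the $5,600 and 28,672-core figures above; the VRAM per render still tops out at a single card's 11 GB.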

  • kyoto kid Posts: 41,873
    Greymom said:

    I was in a computer store the other day buying an SD card for my camcorder and there was a guy in front of me with a dozen Nvidia graphics cards. When he left I asked the cashier "bitcoin mining" and he said, "yeah, he comes in every week and buys every video card we've got and then bitches at us because we don't have more".

    I think as long as that's happening you're not going to see much relief in the price of graphics cards.

    Some bars still have signs saying "We don't serve minors."   Maybe it's time for computer stores to put up "We don't serve cryptominers".

    ...yes

  • kyoto kid Posts: 41,873
    Gazukull said:

    I bought a refurbed workstation off ebay with 96 GB RAM in it just for large scenes.  Honestly... You can get into a box like it for less than a 1080 ti.  I run Iray on it when I have a scene that won't fit into my TITAN X's.  

    Just hit ebay and search for 96 GB ram under Computer.  I like the HP Z800 and Lenovo D30.  

    ...I've even found 128 GB dual-Xeon workstation servers with old Quadro cards (basically with just enough memory to run the displays) for about what I paid to build my system.
