OT Update 2: Nvidia & AMD about to lose a lot of sales from cryptominers?

Comments

  • kyoto kid Posts: 41,861
    Imago said:
     

    Bitcoins are based on thin air. What pricey mineral covers its values? Silicon, maybe?
     

    ...same for US currency. It is little more than a "promissory note" issued by the Federal Reserve (hence the words "Federal Reserve Note" instead of "Gold" or "Silver Certificate") as we went off the gold and silver standard last century.

  • kyoto kid Posts: 41,861
    CypherFOX said:

    Greetings,

    Oso3D said:

    One of the appealing things about Bitcoin is the idea of a distributed responsibility. Instead of governments directly managing the value of some currency, it's more diffuse.

    Which has its upsides and downsides; while I'm a bit iffy on Bitcoin myself, I'm growing to think a lot of the dismissal of Bitcoin comes from people not really facing how baloney our other money systems are, too. ;)

    ^^^ This.

    The truth is that if the US government (or other large governments) wanted to destabilize BTC, they could through the application of massive amounts of computing power.  Mind you, it would take the equivalent of two large nuclear power plants to provide power for it, but it's absolutely doable, and it would take less than a year to destabilize the currency sufficiently.

    The comment about gold is why I specifically call out what we think of as 'real' currencies as 'fiat' currencies; they exist, and have value, because the government backing them says they have value, and people have faith in the government.  Not just you and I, but financiers across the world.  They invest in that currency (Yuan, Yen, Dollar, Euro, etc.) because they trust the governments behind them not to print arbitrarily large amounts of it and to pay their bills.

    This is far afield of the problem of nVidia GPUs not being available.  In the end if a cryptographic solution is strong enough and easy enough to explain and implement, and remains secure, it gets the benefit of that same 'faith' that it's not going to print arbitrarily large quantities (because the amount available is limited algorithmically), and that it's not going to go away (because transactions are being done in it).  As long as that faith is in place, we've built an international network that makes transacting in digital currencies very, very easy.

    ...and thus, unfortunately, I don't see cryptocurrencies going away any time soon, and equivalently the hunger for GPUs.  I believe the ultimate outcome will be that nVidia will ramp up production for a while, until a more tailored solution becomes available, at which point they'll gratefully settle back into being a producer for gaming cards.

    ...

    It's worth noting, though, that rendering is a process very similar to calculating hashes, and any hardware that gets better at cryptomining _could_ in theory be repurposed to improve rendering.  This might be a 'darkest before the dawn' thing.  If there's a boom in massively parallel compute units in order to sate the cryptocurrency miners, it's possible it could throw off other interesting technologies that we can use for rendering.

    --  Morgan

     

    ...Nvidia and AMD ramping up production has two snags: the shortage of memory chips and, as mentioned above, now a shortage of silicon wafers. If you don't have an abundance of the raw components needed, any increase in production rate will be limited to the supply available (unless Nvidia and AMD go into the business of fabricating wafers and memory chips themselves).

    Given that (as again mentioned above) someone can come into a store and buy up all the GPU cards on the shelf, and Nvidia is unable to keep them in stock in its own store in spite of strict purchasing limits, it appears it would require an extremely sizeable jump in production to balance supply with demand, possibly one greater than could be accommodated by the supply of memory chips and wafers.

    Also, would doing so alleviate the situation? Let's say the GPU production rate was doubled and prices dropped to, say, $30 to $50 above where they were over a year ago; what's to stop miners from scarfing up even more cards at the lower prices to further increase the speed and efficiency of their rigs?  We could easily find ourselves in the same situation.  Nvidia can request that vendors impose limits until they are blue in the face, but that is up to the individual vendor (and again, as we are seeing, even Nvidia cannot maintain a stable supply for direct sale, so apparently there are ways of getting around their two-card limit).

  • Charlie Judge Posts: 13,253

    With Google banning advertising for cryptocurrency, maybe the market will slow down some:

    https://www.aol.com/article/finance/2018/03/14/google-bans-cryptocurrency-advertising-bitcoin-price-slumps/23385997/

  • laststand6522732 Posts: 866
    edited March 2018

    ~

    Post edited by laststand6522732 on
  • kyoto kid said:
    Rottenham said:
    Taoz said:

    I'm not sure how VRAM works on a video card, but if there are a lot of constant read/write operations, an SSD wouldn't last long. Besides, VRAM is a lot faster than SSD, and they're making it faster and faster, which there must be a reason for. But maybe that's only relevant for gaming, and not still renders.

    This article says, "VRAM is a special type of DRAM which is dual-ported. It still has a duty cycle, but it can write to and read from at the same time. In practice, this means that you get double the bandwidth out of 60 ns VRAM as you would out of 60 ns DRAM (if implemented correctly on the video card)."

    That said, it is just a little chip. It could be soldered on an M.2 card (or some such) and plugged into anything - a video card, for example. The video driver would never know the difference.

    I can't believe this hasn't been thought of and discussed at AMD and Nvidia. My guess is, the profits are much greater in selling you an entire replacement video card than allowing you to upgrade it. Soldering components in rather than socketing them is always cheaper to begin with. Why mess with success? They can't make the stuff they have now fast enough.
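
    To put rough numbers on the dual-ported claim in that article, here's a quick back-of-envelope in Python (the 60 ns cycle time is from the quote; the 32-bit bus width is just an assumed illustrative figure):

    # Back-of-envelope for the dual-ported VRAM claim quoted above.
    CYCLE_NS = 60        # cycle time from the quoted article
    BUS_BYTES = 4        # assumed 32-bit data path, illustration only

    cycles_per_sec = 1e9 / CYCLE_NS          # ~16.7 million cycles per second
    dram_bw = cycles_per_sec * BUS_BYTES     # single-ported: read OR write per cycle
    vram_bw = dram_bw * 2                    # dual-ported: read AND write per cycle

    print(f"60 ns DRAM: {dram_bw / 1e6:.1f} MB/s")   # ~66.7 MB/s
    print(f"60 ns VRAM: {vram_bw / 1e6:.1f} MB/s")   # ~133.3 MB/s, i.e. double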

    If there's no supply, there's not much profit. Trickle-up economics works just fine. Upgrade paths are a viable alternative for more people than outright replacement is. Nvidia targets two specific markets: gamers with the GTX and brainiacs with the Quadro. Us wee folk on the rendered fringe using low-cost solutions are not "big players" in their ledgers. With the mining boom, they're exploring that market. Daz Studio users are not on their radar because there are a thousand hobbyists for every "getting paid to do it" artist. As well, a lot of us are also gamers, so we're in that market. Those few who are looking at building rendering rigs slowly, with as little lump-sum expense as possible, are not making huge purchases at once, but many streams combine to form a mighty river.

    I still say their best move, and one that's easy for them to do, is build a GTX-based VCA box. It doesn't need 256GB of ECC RAM. Leave it user-configurable for CPU and standard RAM (single CPU or double, Xeon or i7, or whatever), and it doesn't have to run from a server. They can even cut it down so it only takes 4 GPUs, so long as you can add several to a single system. It'd be a lot cleaner than the Amfeltec GPU cluster, which is an electrical, fire, and trip hazard all at once.
    Not to mention it'd be in a snazzy rackmount box.

     

    The problem with this is the same as with the current situation; while it might be targeted at the artistic folks, if it can be made to mine, you can bet your next purchase cost here that they would snap them up like they are doing with video cards now.

    ...exactly, a mining rig in a convenient box.

    It would also be prohibitively expensive for most folks.  If it were built on the same platform as the current VCA (which has 8 Quadro P6000s) using 1080 Tis instead, even at the original base cost per card you are looking at $5,600 just for the cards alone. You would still need a CPU, some memory, and the interconnects as well. It also would only give you 11 GB of VRAM for rendering since, like I mentioned, VRAM does not pool for that process, so the only advantage is that you have 28,672 cores to speed the process up or batch-process multiple render jobs at the same time.
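
    For what it's worth, those figures check out. A quick sanity check in Python, using the 1080 Ti's $699 launch price and its 3,584 CUDA cores:

    # Sanity check on the 8-card GTX 1080 Ti VCA figures above.
    CARDS = 8
    LAUNCH_PRICE_USD = 699   # GTX 1080 Ti launch MSRP
    CUDA_CORES = 3584        # per GTX 1080 Ti

    print(f"cards alone: ${CARDS * LAUNCH_PRICE_USD:,}")   # $5,592, roughly the $5,600 quoted
    print(f"total cores: {CARDS * CUDA_CORES:,}")          # 28,672, matching the post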

    If they offered the GTX version in 4, 6, or 8 GPU versions, or just a 4-GPU version that can be Thunderbolted together in a chain, it'd be easily affordable. If they sold them without GPUs, then users could spread that expense over time. Build your rig up from cheaper 700 series cards to 900 series and then to 1000 series, or just stick with the older 8GB models and add more 4-GPU boxes and more cheaper GPUs.

     

    As for the situation at hand, a possible solution is gaming laptops for the gamers, and integrated Nvidia GPUs in the ready-made desktops from Dell, Alienware, HP, etc., possibly with a system-restricted BIOS that would not let the GPU function outside of the mobo it was keyed to. By encrypting the BIOS, it couldn't be flashed by the user to break the brand lock. System manufacturers could still sell complete systems to gamers, maybe even have a rendering model line, and the miners couldn't buy up the systems to strip their GPUs out.
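
    Just to sketch the "keyed to the mobo" idea, here is a toy illustration in Python. This is only the concept; real vBIOS locking would live in signed firmware, and the key and serial numbers here are made up:

    # Toy sketch of a GPU BIOS keyed to one motherboard. Concept only.
    import hmac, hashlib

    VENDOR_KEY = b"example-vendor-secret"   # hypothetical key burned in at the factory

    def provision(board_serial: str) -> bytes:
        """At assembly time, store a tag binding the GPU to this board's serial."""
        return hmac.new(VENDOR_KEY, board_serial.encode(), hashlib.sha256).digest()

    def gpu_boot_check(board_serial: str, stored_tag: bytes) -> bool:
        """At boot, the GPU only initializes in the board it was provisioned for."""
        expected = hmac.new(VENDOR_KEY, board_serial.encode(), hashlib.sha256).digest()
        return hmac.compare_digest(expected, stored_tag)

    tag = provision("DELL-MB-0001")
    print(gpu_boot_check("DELL-MB-0001", tag))   # True: original board
    print(gpu_boot_check("MINING-RIG-7", tag))   # False: transplanted GPU refuses to run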

    On the other hand, we end up with the old iMacs, and you're stuck with whatever you bought, and couldn't resell it to save your life.

     

    On the issue of currency, shiny metals, sparkly stones, and printed paper can buy food because a long time ago some idiot agreed to trade food for shiny metals and sparkly stones, and some years later, a descendant of that idiot agreed that printed paper had the value of food. Then a gaggle of morons agreed that a lot of food could be traded for a little bit of shiny metal, sparkly stones, or printed paper.

    To this day, society believes that shiny metal, sparkly stones, and printed paper have a higher value than food, because "it's always been that way." When humans are finally erased from the face of the planet, that stupidity will die with them. It will not vanish before then.

    Bitcoin has value because someone agreed to it. It's the Facebook story of the guy who traded a red paperclip for a house. It started off as a joke that blew up into a running gag that blew up into a fad that blew up into an international sensation that blew up into a legitimate currency that will blow up in a lot of faces when reality sets in and everyone tries to cash out. It's vapor.

     

    On the subject of mining rig thefts, there was a story about Russian nuclear scientists using the government supercomputers to mine bitcoins, and they connected the most-secure computers in Russia - the ones that control their nuclear arsenal and power grid - to the interwebs to cash out. It's my understanding Russia is looking for something worse than a pre-war Siberian gulag to put them in.

  • Kitsumo Posts: 1,221
    Rottenham said:

    I'll swap gray hairs with you any time.  wink

    The problem with PCs becoming a niche market is that parts would become less available and more expensive. It's already expensive enough for me. I imagine you remember the scene from Soylent Green where the hotel manager said, "A resistor burned up. We'll have to make one."

     

    Yeah, PC parts probably will get more expensive. I guess I'll just have to live with it. There's no way I'm giving up my PC. I still haven't seen Soylent Green. It's on a long list of movies I need to see.

  • kyoto kid Posts: 41,861
    kyoto kid said:
    Rottenham said:
    Taoz said:

    [...]

    If they offered the GTX version in 4, 6, or 8 GPU versions, or just a 4-GPU version that can be Thunderbolted together in a chain, it'd be easily affordable. If they sold them without GPUs, then users could spread that expense over time. Build your rig up from cheaper 700 series cards to 900 series and then to 1000 series, or just stick with the older 8GB models and add more 4-GPU boxes and more cheaper GPUs.

    ...that may solve the speed issue but not the VRAM one, as you'll still be limited to whatever card has the highest VRAM. For those into animation, yes, such a setup would be great.  For someone like myself who creates large, very detailed, gallery-quality works, it would be pointless, as more often than not the job would dump back to the CPU and all those linked-together cores would be useless.  In the "Might GPU Prices Prompt a Return to CPU-based PBR?" thread, one person mentions an Iray scene swelling to 32 GB during the render process.  That would even dump off a $5,000+ 24 GB P6000, which is the highest-VRAM card available.

  • kyoto kid said:

    one person mentions an Iray scene swelling to 32 GB during the render process.  That would even dump off a $5,000+ 24 GB P6000, which is the highest-VRAM card available.

    Just to be clear, the OP was talking about system RAM.

    As for the VRAM game, maybe learn from my mistake.  I traded a 1070 8 GB for a used P5000 16 GB for $800.  Very difficult decision for me.  I should have known better.

    Place 8 actors and fill with props, approaching 14 GB I think.  Then watch the Iray render's geometry eat 32 GB of RAM for lunch.  So I can't even use this card's potential unless I spend (waste?) even more money upgrading to 64 GB of RAM?

     

  • kyoto kid Posts: 41,861

    ...from the description it sounded like it dumped to the CPU, thus bloating the total load on system memory (render file plus open scene file).   This is my one issue with having such a resource-demanding render engine as Iray integrated rather than standalone like Lux: once you sent a scene to the Lux engine, you could close the scene and even shut the Daz programme down, which freed up a good chunk of system memory. The only downside with Lux was that it was even slower than Iray in CPU mode, by orders of magnitude.

  • laststand6522732 Posts: 866
    edited March 2018

    ~

    Post edited by laststand6522732 on
  • nicstt Posts: 11,715
    edited March 2018
    Oso3D said:
    nicstt said:

    All money only has value because folks believe it does.

    Bitcoin is no different; if that belief falters, then the price goes down by a fair bit - and potentially collapses if enough folks lose faith.

    Well, by that argument, maybe national economies will have a breakdown and regular money will lose value, the world market sustained only by bitcoin...

     

    Not impossible, but unlikely atm.

    But the same applies: national currencies have value because of folks' belief. The UK bank note promises to pay the bearer the sum of X pounds. People believe it, so it has value.

    Imago said:
    nicstt said:

    All money only has value because folks believe it does.

    Bitcoin is no different; if that belief falters, then the price goes down by a fair bit - and potentially collapses if enough folks lose faith.

    Not true, real-world currencies are based on the state's gold possessions. For example, the state of Zumpabalubba (it doesn't exist for real! 0.0) owns 10 million worth of gold in its deposits, so it can issue 10 million Zumpabalubbian Dollars. The loss of "power" of the currency depends on many factors, but the state still has those 10 million in gold bars to cover the money's value. Someone told me it works this way.

    Bitcoins are based on thin air. What pricey mineral covers its values? Silicon, maybe?

    Anyway, even if I don't want Miners to become beggars, I hope this bad situation will end soon. My workstation is giving up, my GPU is begging for mercy...

    You are missing the point; have you seen that gold? Do you know that the world's currencies are backed by sufficient gold? (They are not, IIRC; see below, as I decided to make sure I was remembering correctly.) Governments state various things about their reserves. So it is only because of the faith in said currency that it works. Many folks believe there is the gold to back everything; there isn't. But even the value of gold is down to belief, faith, and its intrinsic scarcity: for much of history it has been useless for most things other than looking good. People valued it due to its appearance and scarcity, and so began its use in commerce.

     

    There is far more money than there is gold: all the gold in the world is worth about $8,037,266,804,300, roughly $8 trillion (http://onlygold.com/Info/All-The-Gold-In-The-World.asp).

    Meanwhile the world's money adds up to $36.8 trillion (https://www.marketwatch.com/story/this-is-how-much-money-exists-in-the-entire-world-in-one-chart-2015-12-18), or up to $90.4 trillion depending on how it is calculated. This is not including various electronic currencies.

     

    Post edited by nicstt on
  • laststand6522732 Posts: 866
    edited March 2018

    ~

    Post edited by laststand6522732 on
  • Richard Haseltine Posts: 108,110

    The nature-of-money discussion is flirting with politics - it's not necessary for the main topic of this thread so please drop it now.

  • Imago Posts: 5,664
    edited March 2018

    Sorry, Richard, my bad! blush

    Back to the topic: I found an article that says Bitcoin will drop to $1,000 in May...

    Maybe our pain is about to end...

    But my concern is: even if Bitcoin loses all its value, the market could just move to another cryptocurrency!

    Post edited by Imago on
  • kyoto kid Posts: 41,861

    ...most likely Ethereum and Monero, which have not been as much of a roller coaster as Bitcoin.  Just because something was "first" on the scene doesn't mean it will remain in that position.

  • Imago Posts: 5,664

    Yes, that's what I meant... The craze for mining could remain even without Bitcoin in the scene. Those crazy people who own 30 rigs with 20 GTX 1080 Tis each will surely start mining something else... and buy more cards in order to reach their previous incomes.

    I hope some developer will finally create a "mining chip" that will take the GPUs' place in those rigs! The market is pretty lively; I'm astonished that nobody has already done this!

  • CypherFOX Posts: 3,401

    Greetings,

    Imago said:

    I hope some developer will finally create a "mining chip" that will take the GPUs' place in those rigs! The market is pretty lively; I'm astonished that nobody has already done this!

    They did; at least one of the companies producing ASIC solutions was...well, charitably they failed to deliver before their product was obsolete, and got in a ton of trouble.

    A custom ASIC can do X giga-hashes per second, but the complexity of mining increases somewhat regularly, and so the return on X GHash/sec is under constant downward pressure.  The thing about GPUs is that they can be programmed to do any of the hashing algorithms out there, and so companies like NiceHash can dynamically shift the folks who've signed up to mine with them onto whichever cryptocurrency has the optimal balance of processing power to cryptocoin value.  If all you're doing is Bitcoin, an ASIC solution is interesting, but you're part of a VERY competitive mining world.  If you need to be able to dynamically switch which hash algorithm you're going to run, ASICs aren't going to help you.  If you're running Ethereum, you're even more hosed: Ethereum requires a virtual machine, since it includes the ability to run (very limited, but interesting) code as part of the transaction, so it really can't run on an ASIC unless your custom hardware includes an EVM, which...seems unlikely.
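
    To make that hashing treadmill concrete, here is a minimal proof-of-work loop in Python. It's a toy (real Bitcoin hashes an 80-byte block header against an astronomically harder target), but it's the same basic search that mining hardware races to do:

    # Minimal proof-of-work sketch: find a nonce whose double-SHA-256 digest
    # falls below a target. Raising the difficulty (lowering the target) is
    # the constant downward pressure on any fixed hash rate.
    import hashlib

    def mine(header: bytes, target: int) -> int:
        nonce = 0
        while True:
            data = header + nonce.to_bytes(8, "little")
            digest = hashlib.sha256(hashlib.sha256(data).digest()).digest()
            if int.from_bytes(digest, "big") < target:
                return nonce
            nonce += 1

    # Toy target: digest must start with ~16 zero bits (about 65,000 tries).
    print(mine(b"example block header", 2 ** 240))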

    So...yes, folks have done this.  And no, it's not very straightforward...

    --  Morgan

     

  • Imago Posts: 5,664
    edited March 2018
    CypherFOX said:
    So...yes, folks have done this.  And no, it's not very straightforward...

    --  Morgan

     

    Oh, boy!
    But... correct me if I'm wrong: I heard somewhere that there is a sort of "bottleneck" in that system and you can't obtain anything more than that limit.

    But I'm not sure about it, just something I heard some time ago, around the time this craziness started. It was one of the reasons that made me drop the idea.

    And it's the reason I'm wondering why they are scavenging every GPU in sight...

    Post edited by Imago on
  • Kitsumo Posts: 1,221

    As I understand it, GPU mining for Bitcoin isn't profitable and hasn't been for a long time. Bitcoin miners use ASICs almost exclusively. The main source of our pain is Ethereum, which was made to be ASIC-resistant, so it requires GPUs. You can just look at https://bitgur.com/map and see that even if ETH does fall out of favor, there are at least a dozen currencies waiting to take its place. I think the best we can hope for is that investors start pulling more money out of cryptocurrencies in general.
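
    To illustrate why ASIC resistance works, here is a toy memory-hard hash in Python. This is emphatically not the real Ethash, just the shape of the idea: every hash forces pseudo-random reads from a large dataset, so a fast-but-small custom chip gains little over a GPU with plenty of VRAM:

    # Toy memory-hard hash. Real Ethash DAGs were multiple gigabytes.
    import hashlib

    DATASET_WORDS = 1_000_000
    dataset = [hashlib.sha256(i.to_bytes(4, "little")).digest()
               for i in range(DATASET_WORDS)]

    def memory_hard_hash(nonce: int, rounds: int = 64) -> bytes:
        h = hashlib.sha256(nonce.to_bytes(8, "little")).digest()
        for _ in range(rounds):
            idx = int.from_bytes(h[:4], "little") % DATASET_WORDS  # random-access read
            h = hashlib.sha256(h + dataset[idx]).digest()
        return h

    print(memory_hard_hash(42).hex())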

  • outrider42 Posts: 3,679
    edited March 2018

    There was a comment in here about how cryptocurrencies haven't quite yet hit critical mass like the housing boom did. They mentioned that everybody was trading houses, even local barbers. And that until practically everybody is doing it, mining has not yet hit that bubble.

    One complication with that is that mining cryptocurrency is fundamentally different from selling houses. Everybody on the planet knows what a house is. And houses are physical things; you can see what you are buying and selling. Cryptocurrency can never achieve that, and while there are many people who may have heard of it, most have no idea what it is. There are millions of people who use the cloud, but they have no idea what the cloud actually is. We still have commercials on TV about the cloud, as if it is some weird faraway thing. It is an aspect of new technology that some people will always avoid, and this alone will always insulate mining to some degree. But more people are paying attention, even if they have no idea what this is. You can watch a nicely informative video from John Oliver discussing all things crypto. This in itself is a big thing, as Oliver brings to light what this is for the masses via HBO and millions of views on YouTube. You also get a fantastically terrifying chicken nugget metaphor, and some lame jokes about a fake know-it-all office worker.

    Do note, there is some foul language in the video. The video is not political, just a somewhat comedic overview of cryptocurrencies and blockchain technology.

    I don't believe the market needs wild speculation like the housing market had to hit critical mass; it is already there in its own way. As the video states, there are over 1,500 different currencies. That...is too many. A lot of these will fail, and bring down the companies that push them. There are many very shady businesses doing very shady things with cryptocurrencies, also discussed in the video. Then you have Google stopping ads, and there will probably be a lot of regulation. I don't think crypto will die or bust like most bubbles. Rather it will fall back down to reality and sort of stabilize. But that will be a crash in its own way, as those who invested too much will be in for a very bad day.

    Post edited by outrider42 on
  • Greymom Posts: 1,140
    edited March 2018
    Imago said:

     

    Post edited by Greymom on
  • Greymom Posts: 1,140

     

    Kitsumo said:

    As I understand it, GPU mining for Bitcoin isn't profitable and hasn't been for a long time. Bitcoin miners use ASICs almost exclusively. The main source of our pain is Ethereum, which was made to be ASIC-resistant, so it requires GPUs. You can just look at https://bitgur.com/map and see that even if ETH does fall out of favor, there are at least a dozen currencies waiting to take its place. I think the best we can hope for is that investors start pulling more money out of cryptocurrencies in general.

    As I understand it, the Ethereum software loads a large dataset into VRAM and crunches away at it, sort of like GPU rendering.  The minimum memory required is 3 GB, and newer versions of the app will supposedly require even more.  New cards with more, faster memory (and more cores too) are advantaged.  According to the mining sites, my 2-year-old R9 290X 8 GB card could net about $50 per year.  A 1070 Ti, more like $450.  It looks like most GPUs have a 2-3 year payback time.
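
    Putting those figures into a quick payback calculation (the yearly incomes are from the post; the card prices are my rough assumptions for early-2018 street prices):

    # Payback time from the figures above. Prices are assumptions.
    cards = {
        "R9 290X 8 GB": (300, 50),    # (assumed street price USD, net USD/year from post)
        "GTX 1070 Ti":  (900, 450),
    }
    for name, (price, net_per_year) in cards.items():
        print(f"{name}: ~{price / net_per_year:.1f} years to pay back")
    # The 1070 Ti lands right at ~2 years, consistent with the 2-3 year claim;
    # the aging R9 290X would take ~6 years at $50/year.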

     

  • kyoto kid Posts: 41,861

    @ outrider42

    ...that was great. "Chicken up your nugs"; I need to remember that line.

    Indeed, what Oliver is getting at pretty much supports my view of the whole situation as having turned into little more than a fad, a get-rich-quick scheme, the latest "pet rock" or "dehydrated water".

    If I had fewer scruples than I do, I could come up with a coin, call it the KKoin or something stupid like that, do a Pump & Dump and then retire to Tahiti.

  • outrider42 Posts: 3,679
    edited March 2018
    Yeah, the big daddy Bitcoin moved beyond most normal people's means a long time ago. But like I said, there are over 1,500 currencies out there, and most of the ones that can be mined are still fair game for GPUs. Most of our GPU woes do not come from Bitcoin at all. This whole thing is due almost entirely to Ethereum.

    The crazy thing to me is that it is seemingly designed to stagnate one day. There is a finite amount of Bitcoin that can ever be mined. That makes sense, but it also gets harder to mine. Eventually it will not be worth mining anymore, as the cost will be too great. So this creates a problem: does Bitcoin gain value by being rare, or does it fall because nobody mines it anymore? What happens when that final bitcoin is mined?
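
    That finite supply falls out of the halving schedule; a quick sketch using the protocol's actual constants (50 BTC starting reward, halving every 210,000 blocks) shows why the total converges on 21 million:

    # Bitcoin's supply cap: the block reward halves every 210,000 blocks,
    # so total issuance converges just under 21 million BTC.
    reward = 50.0          # initial block reward, BTC
    total = 0.0
    while reward >= 1e-8:  # rewards below one satoshi round to zero
        total += reward * 210_000
        reward /= 2
    print(f"total supply: ~{total:,.0f} BTC")   # ~21,000,000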

    It is indeed a fad, but it will have lasting impacts. Blockchain is here to stay, and will be used in all sorts of things. There is even a blockchain rendering service, which touts allowing users to access "millions of GPUs" around the world. Now that could be kind of cool if it actually works. I forget the name, but a quick Google search should find it.
    Post edited by outrider42 on
  • Imago Posts: 5,664
    Greymom said:

    As I understand it, the Ethereum software loads a large dataset into VRAM and crunches away at it, sort of like GPU rendering.  The minimum memory required is 3 GB, and newer versions of the app will supposedly require even more.  New cards with more, faster memory (and more cores too) are advantaged.  According to the mining sites, my 2-year-old R9 290X 8 GB card could net about $50 per year.  A 1070 Ti, more like $450.  It looks like most GPUs have a 2-3 year payback time.

    I saw a video that says the same: costs are way greater than incomes. So why do they insist on it?

    In that video the guy spent about $2,000 on a "base" rig with 4 GTX 1070 Tis, and based on how much energy it consumed added to the initial costs, even with the currency at high values, he required 2 years just to earn back the $2,000 he initially spent!

    About the "last" bitcoin... I guess it is the only way we can get rid of the GPU shortage. Once the mine is empty, miners will put down their pickaxes...

    It will be only a Wall Street matter!

  • Kitsumo Posts: 1,221
    Imago said:
    Greymom said:

    As I understand it, the Ethereum software loads a large dataset into VRAM and crunches away at it, sort of like GPU rendering.  The minimum memory required is 3 GB, and newer versions of the app will supposedly require even more.  New cards with more, faster memory (and more cores too) are advantaged.  According to the mining sites, my 2-year-old R9 290X 8 GB card could net about $50 per year.  A 1070 Ti, more like $450.  It looks like most GPUs have a 2-3 year payback time.

    I saw a video that says the same: costs are way greater than incomes. So why do they insist on it?

    In that video the guy spent about $2,000 on a "base" rig with 4 GTX 1070 Tis, and based on how much energy it consumed added to the initial costs, even with the currency at high values, he required 2 years just to earn back the $2,000 he initially spent!

    About the "last" bitcoin... I guess it is the only way we can get rid of the GPU shortage. Once the mine is empty, miners will put down their pickaxes...

    It will be only a Wall Street matter!

    Yep, despite how the media tries to portray it, I don't see cryptominers driving around in Ferraris. No matter how prices rise or fall, mining difficulty adjusts to the point where it takes 1.5 to 2 years to cover hardware costs. And miners that do make a profit usually end up buying more hardware just to try to stay ahead of the game. I think it's gotten worse this past year because the economy's doing so well. People have a lot more money to throw around. When we have a recession, things will cool down a little. No politics.
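
    That feedback loop is easy to see in Bitcoin's retarget rule, sketched here in simplified form (the real rule also clamps each adjustment to a factor of 4 either way):

    # Every 2016 blocks, difficulty scales so blocks keep taking ~10 minutes.
    def retarget(difficulty: float, actual_minutes: float) -> float:
        expected_minutes = 2016 * 10
        # More hash power -> blocks found faster -> difficulty rises to match.
        return difficulty * expected_minutes / actual_minutes

    d = retarget(1.0, 2016 * 5)   # hash rate doubled: blocks took 5 minutes each
    print(d)                      # 2.0 -- per-card income drops right back down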

    That's probably why AMD & NVDA don't want to open new factories just to produce more cards. It would be hard to explain to shareholders why you have a brand-new multi-billion-dollar factory sitting idle and a warehouse full of cards that you can't sell because you crashed the market. Intel is working on a discrete GPU, and they've hired a senior executive away from AMD to help. But it will most likely be a range of mid-level cards, and it won't reach the market any time soon. It won't help Iray users directly, but it will be nice to have another player in the 3D market.

  • Imago Posts: 5,664
    edited March 2018
    Kitsumo said:

    Intel is working on a discrete GPU, and they've hired a senior executive away from AMD to help. But it will most likely be a range of mid-level cards, and it won't reach the market any time soon. It won't help Iray users directly, but it will be nice to have another player in the 3D market.

    Never heard about it! But as you say, it will be very nice to have a "third wheel" in the market; it will spice up competition a bit, making prices and quality better.

    Or at least it will give miners another target to aim at...

    Post edited by Imago on
  • Llynara Posts: 4,772

    Well, I took the plunge. Found a new EVGA GeForce 1080 Ti for $1,200 on Amazon. More than I wanted to pay, but not the $1,500 or more some people were asking. I've been looking all over the web for some time. I've waited two years for this card, and decided to pinch elsewhere and just do it. I wasn't going to wait another year or two for the price to go back down to the $800 range it was around Christmas. Still kicking myself for not getting it then, but I didn't have the funds. I'm happy with my card, and I'd better be, because I'm done making hardware purchases for a while.

  • Padone Posts: 4,016
    edited March 2018
    Llynara said:

    Found a new EVGA GeForce 1080 Ti for $1,200 on Amazon.

    It's $960 on Newegg... Also, you may want to consider a $530 1070 instead: a little slower, and still 8 GB of VRAM for rendering.

    https://www.videocardbenchmark.net/gpu.php?id=3699

    https://www.videocardbenchmark.net/gpu.php?id=3521

    https://www.newegg.com/Product/Product.aspx?Item=N82E16814125955

    https://www.newegg.com/Product/Product.aspx?Item=N82E16814137092

    As for mining, to me it seems a game that big daddy created to fund new technology. I mean, you always need faster hardware to get the next coin, so profit is strictly time-limited. Also, it's not real money until you get an investor to pay for it. And its value seems to be based on market bubbles, so it's pure speculation. In a word, it's just gambling for tech people.

    Post edited by Padone on
  • Kitsumo Posts: 1,221
    edited March 2018
    Imago said:
    Kitsumo said:

    Intel is working on a discrete GPU, and they've hired a senior executive away from AMD to help. But it will most likely be a range of mid-level cards, and it won't reach the market any time soon. It won't help Iray users directly, but it will be nice to have another player in the 3D market.

    Never heard about it! But as you say, it will be very nice to have a "third wheel" in the market; it will spice up competition a bit, making prices and quality better.

    Or at least it will give miners another target to aim at...

    Here's an article. Like most tech articles, it's short on details and doesn't give a date, but it at least gives us hope. I'm sure some of the old-timers can remember a day when we had more than two graphics card companies (3dfx, Rendition, PowerVR, Matrox, S3, etc.).

     

    Post edited by Kitsumo on