ASIC-Mining

GPUs are horribly inefficient compared to ASICs. Consider that the absolute best GPUs manage around 1 GH/s and cost in the $400 range. An Antminer U2 costs around $20 on eBay and gets 2 GH/s; an Antminer U3 costs around $60 and gets 60 GH/s.
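
To make that comparison concrete, here is a minimal Python sketch that computes hash rate per dollar from the approximate prices and hash rates quoted above (these are this page's rough figures, not current market prices):

  # Rough hash-rate-per-dollar comparison using the approximate figures above.
  devices = {
      "High-end GPU": {"hashrate_ghs": 1,  "price_usd": 400},
      "Antminer U2":  {"hashrate_ghs": 2,  "price_usd": 20},
      "Antminer U3":  {"hashrate_ghs": 60, "price_usd": 60},
  }

  for name, d in devices.items():
      print(f"{name:12s}: {d['hashrate_ghs'] / d['price_usd']:.4f} GH/s per dollar")

  # High-end GPU: 0.0025 GH/s per dollar
  # Antminer U2 : 0.1000 GH/s per dollar
  # Antminer U3 : 1.0000 GH/s per dollar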

Roughly speaking, at current network hash rates and the current bitcoin price, if you get electricity for free you can make back the hardware cost of an Antminer U3 in about a year. Of course, if the network hash rate goes up it will take longer, and if the bitcoin price goes up it will take less time.
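
That payback estimate follows from the standard expected-revenue formula: your share of the network hash rate, times the blocks found per day, times the block reward, times the bitcoin price. The Python sketch below reproduces it; the network hash rate, block reward, and price are illustrative placeholders for the era this page describes, not live figures:

  # Rough payback estimate for an Antminer U3 with free electricity.
  # Network and price figures are illustrative placeholders, not live data.
  my_hashrate_ghs = 60          # Antminer U3
  hardware_cost_usd = 60

  network_hashrate_ghs = 350e6  # assumed ~350 PH/s network hash rate
  block_reward_btc = 25         # assumed block reward
  blocks_per_day = 144          # one block roughly every 10 minutes
  btc_price_usd = 250           # assumed price

  # Expected BTC/day = (your share of total hash power) * (BTC issued per day)
  btc_per_day = (my_hashrate_ghs / network_hashrate_ghs) * blocks_per_day * block_reward_btc
  usd_per_day = btc_per_day * btc_price_usd
  print(f"~${usd_per_day:.2f}/day -> payback in ~{hardware_cost_usd / usd_per_day:.0f} days")
  # ~$0.15/day -> payback in ~389 days, i.e. about a year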

An ASIC is expensive in quantity one (around $2,000,000, since the design and tooling costs fall on a single chip) but cheap in quantity 100,000 (around $5,000,000 in total). Such a chip can mine unbelievably well (perhaps 1 GH/s), but nobody (as far as we know) has gone to the expense of making one for bitcoin mining yet.


What process are we using? We're keeping that close to our chest at the moment. However, I can confirm that our detailed modelling at this point indicates we'll be able to mine 250 GH/s in a single rack of mining units using 5 kW of power. When you consider that this represents the computational power of about 400 AMD Radeon 6990 GPUs (which would consume close to 200 kW), you can immediately see the benefit of ASICs for Bitcoin mining.
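
Expressed as energy per unit of hash rate, those numbers imply roughly a 40x power advantage. The quick Python calculation below uses only the 250 GH/s / 5 kW and 400-GPU / ~200 kW figures quoted above (so each 6990 is taken as ~0.625 GH/s, purely to match those totals):

  # Back-of-the-envelope efficiency comparison from the figures above.
  asic_ghs, asic_watts = 250, 5_000          # one rack of ASIC miners
  gpu_ghs, gpu_watts = 400 * 0.625, 200_000  # 400 Radeon 6990s, same total hash rate

  # Watts per (GH/s) is the same as joules per GH.
  print(f"ASIC rack: {asic_watts / asic_ghs:.0f} J/GH")  # 20 J/GH
  print(f"GPU farm:  {gpu_watts / gpu_ghs:.0f} J/GH")    # 800 J/GH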

Benefits

  • Low power consumption per hash, so the electricity cost per GH/s is far lower than with GPUs


Challenges

  • Requires design and manufacture of a custom chip for a specific purpose.
  • Not flexible: the chip can only compute the single hash function it was designed for
  • Large upfront cost
  • Slow time to market
  • Memory-hard crypto algorithms (e.g. Ethereum's Ethash) are a problem, since they are designed to resist ASIC acceleration


Related