Proof-of-work difficulty increasing
We had our first automatic adjustment of the proof-of-work difficulty on 30 Dec 2009.
The minimum difficulty is 32 zero bits, so even if only one person were running a node, the difficulty wouldn’t get any easier than that. For most of last year, we were hovering below the minimum. On 30 Dec we broke above it and the algorithm adjusted to a higher difficulty. It’s been getting more difficult at each adjustment since then.
The adjustment on 04 Feb took it up from 1.34 times last year’s difficulty to 1.82 times. That means you now generate only 55% as many coins as you did last year for the same amount of work.
The difficulty adjusts proportionally to the total effort across the network. If the number of nodes doubles, the difficulty will also double, returning the total generated to the target rate.
For those technically inclined, the proof-of-work difficulty can be seen by searching for “target:” in debug.log. It’s a 256-bit unsigned hex number, which the SHA-256 value has to be less than to successfully generate a block. It gets adjusted every 2016 blocks, typically two weeks. That’s when it prints “GetNextWorkRequired RETARGET” in debug.log.
minimum 00000000ffff0000000000000000000000000000000000000000000000000000
30/12/2009 00000000d86a0000000000000000000000000000000000000000000000000000
11/01/2010 00000000c4280000000000000000000000000000000000000000000000000000
25/01/2010 00000000be710000000000000000000000000000000000000000000000000000
04/02/2010 000000008cc30000000000000000000000000000000000000000000000000000
14/02/2010 0000000065465700000000000000000000000000000000000000000000000000
24/02/2010 0000000043b3e500000000000000000000000000000000000000000000000000
08/03/2010 00000000387f6f00000000000000000000000000000000000000000000000000
21/03/2010 0000000038137500000000000000000000000000000000000000000000000000
01/04/2010 000000002a111500000000000000000000000000000000000000000000000000
12/04/2010 0000000020bca700000000000000000000000000000000000000000000000000
21/04/2010 0000000016546f00000000000000000000000000000000000000000000000000
04/05/2010 0000000013ec5300000000000000000000000000000000000000000000000000
19/05/2010 00000000159c2400000000000000000000000000000000000000000000000000
29/05/2010 000000000f675c00000000000000000000000000000000000000000000000000
11/06/2010 000000000eba6400000000000000000000000000000000000000000000000000
24/06/2010 000000000d314200000000000000000000000000000000000000000000000000
06/07/2010 000000000ae49300000000000000000000000000000000000000000000000000
13/07/2010 0000000005a3f400000000000000000000000000000000000000000000000000
16/07/2010 000000000168fd00000000000000000000000000000000000000000000000000
27/07/2010 00000000010c5a00000000000000000000000000000000000000000000000000
05/08/2010 0000000000ba1800000000000000000000000000000000000000000000000000
15/08/2010 0000000000800e00000000000000000000000000000000000000000000000000
26/08/2010 0000000000692000000000000000000000000000000000000000000000000000
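The difficulty factors in the table below are just the ratio of the minimum target to the current target. A minimal sketch of that conversion (illustrative only, not code from the client):

```python
# A sketch: difficulty factor = minimum-difficulty target / current target.
MAX_TARGET = int(
    "00000000ffff0000000000000000000000000000000000000000000000000000", 16
)

def difficulty(target_hex: str) -> float:
    """Difficulty relative to the 32-zero-bit minimum target."""
    return MAX_TARGET / int(target_hex, 16)

# The 30/12/2009 target from the list above:
d = difficulty(
    "00000000d86a0000000000000000000000000000000000000000000000000000"
)
print(round(d, 2))  # 1.18, matching the first adjustment in the table below
```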
date, difficulty factor, % change
2009 1.00
30/12/2009 1.18 +18%
11/01/2010 1.31 +11%
25/01/2010 1.34 +2%
04/02/2010 1.82 +36%
14/02/2010 2.53 +39%
24/02/2010 3.78 +49%
08/03/2010 4.53 +20%
21/03/2010 4.57 +1%
01/04/2010 6.09 +33%
12/04/2010 7.82 +28%
21/04/2010 11.46 +47%
04/05/2010 12.85 +12%
19/05/2010 11.85 -8%
29/05/2010 16.62 +40%
11/06/2010 17.38 +5%
24/06/2010 19.41 +12%
06/07/2010 23.50 +21%
13/07/2010 45.38 +93%
16/07/2010 181.54 +300%
27/07/2010 244.21 +35%
05/08/2010 352.17 +44%
15/08/2010 511.77 +45%
26/08/2010 623.39 +22%
Another big jump in difficulty yesterday, from 1.82 times to 2.53 times, a 39% increase over the 10 days since the last adjustment. It was 10 days apart instead of 14 because more nodes joined and generated the 2016 blocks in less time.
[Edit: I later found that I was generating quite a bit more than that, just didn’t realize it because of the “matures in xx more blocks” concept. I still think it will be a major headache when the difficulty significantly increases though. I apologize for my silliness Smiley]
Satoshi, I figured it will take my modern Core 2 Duo about 20 hours of nonstop work to create 50.00! With older PCs it will take forever. People like to feel that they “own” something as soon as possible; is there a way to make the generation more divisible? Say, instead of making 50 every 20 hours, make 5 every 2 hours?
I don’t know if that means reducing the block size or reducing the 120-block threshold to, say, 12 blocks, but because the difficulty is increasing I can imagine that a year from now the situation will be even worse (3+ weeks until you see the first spendable coins!), and we’d better find a solution for this ASAP.
I would like to comment that as of late, it seems almost as if I am generating nearly no Bitcoins. Indeed, my rate of acquisition seems to be more than ten times slower. If I cannot stay online for about fourteen consecutive hours (very hard to do on a satellite connection!), I actually get nothing at all.
How this exactly relates to the difficulty adjustments is beyond my knowledge; I offer this feedback as a kind of “field report”.
I generated 5 blocks today on my Pentium processor. Two of them were within 3 minutes of each other.
I have noticed some slowdown since the adjustment, but I still generate a lot of coins. My computer is off while I’m sleeping, and BitCoin bootstraps quickly when I turn it back on. Do you guys-who-are-having-trouble have the BitCoin port open?
My port is open, both in my software and hardware firewall. My router is handling it appropriately. Perhaps it has to do with my connection’s very high latency (2000ms or more on average) and/or my high packet loss (sometimes up to 10% loss)?
Quote from: Suggester on February 16, 2010, 02:15:49 AM
Satoshi, I figured it will take my modern Core 2 Duo about 20 hours of nonstop work to create ฿50.00! With older PCs it will take forever. People like to feel that they “own” something as soon as possible; is there a way to make the generation more divisible? Say, instead of making ฿50 every 20 hours, make ฿5 every 2 hours?
I thought about that but there wasn’t a practical way to do smaller increments. The frequency of block generation is balanced between confirming transactions as fast as possible and the latency of the network.
The algorithm aims for an average of 6 blocks per hour. If it was 5 bc per block and 60 blocks per hour, there would be 10 times as many blocks and the initial block download would take 10 times as long. It wouldn’t work anyway, because that would be only 1 minute average between blocks, too close to the broadcast latency when the network gets larger.
Quote from: Sabunir on February 16, 2010, 08:51:51 AM
… Perhaps it has to do with my connection’s very high latency (2000ms or more on average)
2 seconds of latency in both directions should reduce your generation success by less than 1%.
Quote from: Sabunir on February 16, 2010, 08:51:51 AM
and/or my high packet loss (sometimes up to 10% loss)?
Probably OK, but I’m not sure. The protocol is designed to resync to the next message, and messages get re-requested from all the other nodes you’re connected to until received. If you miss a block, it’ll also keep requesting it every time another block comes in and it sees there’s a gap. Before the original release I did a test dropping 1 out of 4 random messages under heavy load until I could run it overnight without any nodes getting stuck.
How do you adjust this difficulty, anyway? (Administrating a decentralized system?) And what would prevent an attacker from setting the difficulty very low or very high to interfere with the system?
Quote from: Sabunir on February 21, 2010, 04:58:44 PM
How do you adjust this difficulty, anyway? (Administrating a decentralized system?) And what would prevent an attacker from setting the difficulty very low or very high to interfere with the system?
My understanding is that every Bitcoin client has the same algorithm (formula) built into it to automatically adjust the difficulty every so many blocks. Not only that, but I think that Bitcoin will not accept blocks generated at a different difficulty, so if a modified Bitcoin client tried to send out more easily generated blocks, all the authentic clients would reject the fake blocks.
The automatic adjustment happened earlier today.
24/02/2010 0000000043b3e500000000000000000000000000000000000000000000000000
24/02/2010 3.78 +49%
I updated the first post.
The formula is based on the time it takes to generate 2016 blocks. The difficulty is multiplied by 14/(actual days taken). For instance, this time it took 9.4 days, so the calculation was 14/9.4 = 1.49. Previous difficulty 2.53 * 1.49 = 3.78, a 49% increase.
I don’t know what you’re talking about accepting easier difficulties.
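The retarget rule described above can be sketched like this (a simplification, not the client’s integer arithmetic; the 4x-per-adjustment cap comes up later in the thread):

```python
def retarget(old_difficulty, actual_days, target_days=14.0):
    """Scale difficulty by (target time / actual time) for the last
    2016 blocks, clamped to at most a 4x change per adjustment."""
    factor = target_days / actual_days
    factor = max(0.25, min(4.0, factor))  # the cap, simplified
    return old_difficulty * factor

# The 24 Feb adjustment: 2016 blocks at difficulty 2.53 took 9.4 days.
print(round(retarget(2.53, 9.4), 2))   # 3.77 (3.78 with unrounded inputs)
# The 16 Jul adjustment hit the 4x cap: 45.38 * 4 = 181.52 (~181.54)
print(round(retarget(45.38, 3.5), 2))
```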
Maybe someone with a little background in this statistics/math stuff can shed some light on this..
The way this thing works is it takes a (basically random) block of data and alters a 32-bit field inside it by starting at 1 and incrementing. The block of data also contains a timestamp, and that’s incremented occasionally just to keep mixing it up (but the incrementing field isn’t restarted when the timestamp is updated). If you get a new block from the network, you sort of end up having to start over with the incrementing field at 1 again; however, all the other data changed too, so it’s not the same thing you’re hashing anyway.
The way I understand it, since the data that’s being hashed is pretty much random and because the hashing algorithm exhibits the ‘avalanche effect’ it probably doesn’t matter if you keep starting with 1 and incrementing it or if you use pseudo random values instead, but I was wondering if anyone could support this or disprove it.
Can you increase your likelihood of finding a low numerical value hash by doing something other than just sequentially incrementing that piece of data in the input? Or is this equivalent to trying to increase your chances of rolling a 6 (with dice) by using your other hand?
Quote from: laszlo on May 11, 2010, 01:13:07 PM
The way I understand it, since the data that’s being hashed is pretty much random and because the hashing algorithm exhibits the ‘avalanche effect’ it probably doesn’t matter if you keep starting with 1 and incrementing it or if you use pseudo random values instead, but I was wondering if anyone could support this or disprove it.
Yep, your understanding here is correct. It does not matter what exactly gets hashed, and no, you can’t cheat without first breaking SHA-256, which is considered difficult.
The salient property of cryptographic hash functions is that they are as random as is possible while still being deterministic. That’s what their strength depends on — after all, if they weren’t random, if there were obvious patterns, they could be broken that way. So the ideal hash function behaves just like a random number generator. It does not matter what you feed in, timestamp or not; whatever’s put in there, the hash should still behave randomly (i.e. every possible outcome has the same a priori probability of occurring). Incrementing by one works just as well as completely changing everything every step (this follows from the avalanche property). However, the initial value, before you start incrementing, must be (pseudo-)randomly chosen, or every computer will start at the same point, and the fastest one always wins, which is not what is wanted here.
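The avalanche property is easy to observe directly: changing a nonce by one flips, on average, about half of the 256 output bits. A quick illustration (not part of the client; the header bytes here are made up):

```python
import hashlib

def block_hash(data: bytes) -> int:
    # Double SHA-256, as Bitcoin hashes block headers.
    return int.from_bytes(
        hashlib.sha256(hashlib.sha256(data).digest()).digest(), "big")

a = block_hash(b"header data" + (1).to_bytes(4, "little"))  # nonce = 1
b = block_hash(b"header data" + (2).to_bytes(4, "little"))  # nonce = 2

# Hamming distance between the two hashes: expected ~128 of 256 bits.
flipped = bin(a ^ b).count("1")
print(flipped)
```

Because of this, sequential nonces are statistically no worse (and no better) than random ones.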
A nice addition to the GUI would be an estimate of how many hashes/sec it’s computing. Either present this as a raw number or a “you can expect to generate X packs of bitcoins per week.”
This might partially solve the frustration of new users not getting any Bitcoins right away.
That’s a good idea. I’m not sure where exactly to fit that in, but it could certainly calculate the expected average time between blocks generated, and then people would know what to expect.
Every node and each processor has a different public key in its block, so they’re guaranteed to be scanning different territory.
Whenever the 32-bit nonce starts over at 1, bnExtraNonce gets incremented, which is an arbitrary precision integer.
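That scanning loop can be sketched roughly as follows (hypothetical field layout and names; a very easy target is used so the sketch finds a solution in a few hundred tries, whereas the network minimum is 32 zero bits):

```python
import hashlib

def block_hash(data: bytes) -> int:
    # Double SHA-256, as Bitcoin hashes block headers.
    return int.from_bytes(
        hashlib.sha256(hashlib.sha256(data).digest()).digest(), "big")

def scan(header: bytes, target: int):
    """Increment the 32-bit nonce; whenever it would start over,
    bump extra_nonce, which changes the data being hashed."""
    extra_nonce = 0
    while True:
        extra_nonce += 1
        for nonce in range(1, 2 ** 32):
            attempt = (header + extra_nonce.to_bytes(8, "little")
                       + nonce.to_bytes(4, "little"))
            if block_hash(attempt) <= target:
                return extra_nonce, nonce

# A toy target requiring only 8 zero bits (~256 expected tries):
easy_target = (1 << 248) - 1
extra, nonce = scan(b"example header", easy_target)
```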
The main part of the formula that I’m uneasy about is the “target probability” of 0.5. 0.5 is used for calculations involving brute-forcing passwords, but maybe this is different. If your blocks consistently take double the amount of time that the formula predicts, use 1 instead.
Quote from: theymos on June 05, 2010, 03:02:57 PM
The main part of the formula that I’m uneasy about is the “target probability” of 0.5. 0.5 is used for calculations involving brute-forcing passwords, but maybe this is different. If your blocks consistently take double the amount of time that the formula predicts, use 1 instead.
I thought about that. I don’t know if 0.5 is valid or not. I’ll continue to take observations. I wonder if it writes to the debug log when it has success.
Actually, that formula assumes that you’re working on one block until you figure it out. In Bitcoin, aren’t multiple nodes working on the same block? When one finishes, the others abandon work on it and choose another block? That was the impression that I had, but it might be wrong.
Quote from: lachesis on June 05, 2010, 03:28:29 PM
I thought about that. I don’t know if 0.5 is valid or not. I’ll continue to take observations. I wonder if it writes to the debug log when it has success.
Use a value of 1, not 0.5. Suppose max=100 and target=10, then 10 out of every 100 hashes will be at or below the target, so your success rate will be 10% NOT 5%.
At the moment target/max = 1.5x10^-11 (target=0x000000000f, which is 36 zeros, so you basically need to throw a dice with 2^36=69 billion sides, and wait until you get a 1), and you’re doing 1 million x 86400 = 86.4 billion hashes per day, so you can expect slightly more than one success per day.
It’s VERY important to realise that this is the AVERAGE bitcoin creation time, and will only be valid over periods longer than about a week or so. Because a success event is completely random (I hope; otherwise the hash function is probably not secure and someone will eventually crack it, and therefore bitcoin!), the interval between one success and the next will follow an exponential distribution (the n=0 case of the Poisson distribution; see wikipedia). Therefore, with an average rate of, say, 1 success per day, you can expect that roughly 10% of the time you’ll have to wait 2.3 days or more, 1% of the time 4.6 days, 0.1% of the time 7 days, and so on.
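Those waiting-time percentages follow directly from the exponential distribution: with an average rate of λ successes per day, the chance of waiting longer than t days is exp(−λt). A quick check of the numbers (illustrative only):

```python
import math

rate = 1.0  # average successes per day

def p_wait_longer_than(days: float) -> float:
    """Exponential inter-arrival times: P(wait > t) = exp(-rate * t)."""
    return math.exp(-rate * days)

for days in (2.3, 4.6, 6.9):
    print(days, round(p_wait_longer_than(days), 3))
# roughly 10% beyond 2.3 days, 1% beyond 4.6 days, 0.1% beyond ~7 days
```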
I integrated the hashmeter idea into the SVN version. It displays khash/s in the left section of the status bar.
Two new log messages:
21/06/2010 01:23 hashmeter 2 CPUs 799 khash/s
21/06/2010 01:23 generated 50.00
grep your debug.log for “generated” to see what you’ve generated, and grep for “hashmeter” to see the performance. On Windows, use: findstr “hashmeter generated” “%appdata%\bitcoin\debug.log”
I have it print the hashmeter message once an hour. How often do you think it should be?
[Deleted] Quote from: davidonpda on June 22, 2010, 02:55:37 PM
How about in the options menu you can turn it off or on, and specify an interval in minutes for how often it should display?
I say keep it simple; more choices isn’t always better, it just makes it overwhelming and confusing for most users.
Agree. Certainly too trivial to clutter the user’s attention with.
I changed it to every 30 minutes.
If I increased it to every 10 minutes, it would still be a small enough presence in the log file. Question is whether that would be more output than the user wants when they grep.
I’d be interested in seeing something like “expected bitcoins generated/day” next to (or in place of) the khash/s number. I’d rarely need to see the khash/s number since that won’t change unless I make changes to the software or hardware.
Quote from: gould on July 13, 2010, 09:30:59 PM
I’d be interested in seeing something like “expected bitcoins generated/day” next to (or in place of) the khash/s number. I’d rarely need to see the khash/s number since that won’t change unless I make changes to the software or hardware.
You can use the calculator at: http://www.alloscomp.com/bitcoin/calculator.php
If this is a feature request, post it in the Development & Technical Discussion forum: http://bitcointalk.org/index.php?board=6.0
13/07/2010 0000000005a3f437d4a7f529fd4a7f529fd4a7f529fd4a7f529fd4a7f529fd4a
The proof-of-work difficulty is currently 45.38. (see http://www.alloscomp.com/bitcoin/calculator.php)
It’s about to increase again in a few hours. It’s only been 3-4 days since the last increase, so I expect it will increase by the max of 4 times, or very nearly the max. That would put it at 181.54.
The target time between adjustments is 14 days, 14/3.5 days = 4.0 times increase.
Quote from: satoshi on July 16, 2010, 02:46:12 PM
The proof-of-work difficulty is currently 45.38. (see http://www.alloscomp.com/bitcoin/calculator.php)
It’s about to increase again in a few hours. It’s only been 3-4 days since the last increase, so I expect it will increase by the max of 4 times, or very nearly the max. That would put it at 181.54.
The target time between adjustments is 14 days, 14/3.5 days = 4.0 times increase.
Holy…
Satoshi, what happens if the rush dries up for a bit; some of the slashdotters or whoever get tired? Does the difficulty ever go back down?
Quote from: Bitcoiner on July 16, 2010, 02:48:54 PM
Quote from: satoshi on July 16, 2010, 02:46:12 PM
The proof-of-work difficulty is currently 45.38. (see http://www.alloscomp.com/bitcoin/calculator.php)
It’s about to increase again in a few hours. It’s only been 3-4 days since the last increase, so I expect it will increase by the max of 4 times, or very nearly the max. That would put it at 181.54.
The target time between adjustments is 14 days, 14/3.5 days = 4.0 times increase.
Holy…
Satoshi, what happens if the rush dries up for a bit; some of the slashdotters or whoever get tired? Does the difficulty ever go back down?
If I’m reading the source code correctly, it should go up and down based on how much CPU is being thrown at it. So if someone rented a super computer to drive up the difficulty for a week, then it vanished, the difficulty should float back down.
It adjusted to 181.54 a few minutes ago. Typical time to get a block is about a week now.
The difficulty can adjust down as well as up.
The network should be generating close to 6 blocks per hour now.
Quote from: gould on July 13, 2010, 09:30:59 PM
I’d be interested in seeing something like “expected bitcoins generated/day” next to (or in place of) the khash/s number. I’d rarely need to see the khash/s number since that won’t change unless I make changes to the software or hardware.
I think the web calc does a good job by showing likelihoods based on khash speed: http://www.alloscomp.com/bitcoin/calculator.php
That way you can see there is no guaranteed time horizon.
Quote from: satoshi on July 16, 2010, 04:56:54 PM
It adjusted to 181.54 a few minutes ago. Typical time to get a block is about a week now.
The difficulty can adjust down as well as up.
The network should be generating close to 6 blocks per hour now.
Yeah, I’ve noticed the “10 second blocks” are gone, replaced with 419 and 741 second block generation with no more in the last 20 minutes. That should keep those server farms on hold for a while Wink
Now, correct me if I’m wrong, but now that block generation is taking a lot longer, doesn’t that mean that the lucky person who got the block is going to take a lot longer to be verified by the network that he/she was the winner before they could ever spend it?
Yes, about 20 hours. (120 confirmations / 6 blocks per hour = 20 hours.) That’s the normal length of time before you can spend it. You’ll know long before that that you won one.
Quote from: satoshi on July 16, 2010, 05:29:28 PM
Yes, about 20 hours. (120 conf / 6 blocks per hour = 20 hours) That’s the normal length of time before you can spend it. You’ll know long before that that you won one.
So if the difficulty was increased so high that it took a day to find a winning block, that means the lucky winner would have to wait 120 days, or about 4 months, before they could spend it if everyone else was averaging about the same speed? Seems like at the high end of the difficulty, there is an issue with coin generation vs. being able to put it into circulation by spending. Wouldn’t the long delay cause a lot of generated coin to be lost, because anything could happen to the winning PC in that amount of time? They might uninstall the program, or the computer could get eaten by a virus or power surge well before then.
Quote from: knightmb on July 16, 2010, 05:33:57 PM
Quote from: satoshi on July 16, 2010, 05:29:28 PM
Yes, about 20 hours. (120 conf / 6 blocks per hour = 20 hours) That’s the normal length of time before you can spend it. You’ll know long before that that you won one.
So if the difficulty was increased so high that it took a day to find a winning block, that means the lucky winner would have to wait 120 days, or about 4 months, before they could spend it if everyone else was averaging about the same speed? Seems like at the high end of the difficulty, there is an issue with coin generation vs. being able to put it into circulation by spending. Wouldn’t the long delay cause a lot of generated coin to be lost, because anything could happen to the winning PC in that amount of time? They might uninstall the program, or the computer could get eaten by a virus or power surge well before then.
I think that the overall network is generating the same amount of blocks regardless of the difficulty; the difficulty is intended so that the network generates a block in a relatively constant amount of time. Therefore, this confirmation time should always be around the same.
Satoshi or anyone else can correct me if I’m wrong Smiley
Right, the difficulty adjustment is trying to keep it so the network as a whole generates an average of 6 blocks per hour. The time for your block to mature will always be around 20 hours.
The recent adjustment put us back to close to 6 blocks per hour again.
There’s a site where you can see the time between blocks, and since block 68545, it’s been more like 10 minutes per block: http://nullvoid.org/bitcoin/statistix.php
In the Economy subforum, I have just written a post titled “Get rid of ‘difficulty’ and maintain a constant rate” which outlines a scheme which a new version of the BitCoin software could use to keep the rate of block generation absolutely constant at the cost of a slight increase in network traffic.
I would be very grateful for your comments.
ByteCoin
I value Bitcoin as an anonymous digital currency. Although I’m not expecting to get rich, I’d like the ability to continuously generate enough Bitcoin to purchase desired services.
Is there any expectation that economic value per khash/sec (or client) per day will be at least somewhat stable? Difficulty just increased 300%, and USD/Bitcoin just increased about 500% (although that may turn out to be a spike). I do get that there’s no necessary relationship. However, perhaps there’s an economic basis for one (however approximate it might be).
New difficulty factor 244.213223092
+35%
I updated the first post.