Lost and hodled - understanding Bitcoin’s 'Unspent Transactions'

submitted by ulros to fbitcoin [link] [comments]

Discussion: Lost bitcoins are a good thing, they increase Bitcoin's value

submitted by RealSirJoe to Bitcoin [link] [comments]

Who wants to be a billionaire!

Great opportunity available here, first come, first served.
I am looking for someone that I can promise to give a million bitcoins to. I'll trawl the 'chain and come up with a bunch of currently unspent coinbases (a soon-to-be-precedent case will establish that it just doesn't matter if any of them move in the future).
I won't actually give them to you, though, so you are going to have to sue me. Yeah, that's like 400 bucks in filing fees alone, can't make money without spending money. This is like a guaranteed 1000000x though.
Here's how it'll work, see, I promised to give you these coins, but here's the key: I won't! I'll even admit that I didn't straight up. Breach of promise. We'll even work in something like you sent me a hat or whatever, for consideration. And I'll swear in court I mined those coins. I'll swear, like, really hard, which means, in a court of law, that I must be telling the truth, and I'll even mention a lot of witnesses, who, for reasons of privilege such as priest-penitent, doctor-patient, lawyer-client, spousal, and dire complications of admiralty law (where the captain said I cannot make it happen), cannot be produced. I might even cry a little!
Then I'll lose, because everyone here knows I made this promise and I won't deny it.
Then, the court will simply order that miners award all these coins to you, because I lost, and you'll be rich!
Foolproof, right? This strategy is endorsed by renowned legal expert Mr. Wright, and I challenge anyone to demonstrate how my fact pattern deviates from his theory as applied in his empirical demonstration, which will be an assured success.
submitted by Annuit-bitscoin to bsv [link] [comments]

[Question] What happens when people die with btc in their wallets

I've been wondering: if private keys are never recovered and certain wallets can never be accessed, what happens to the price, and to the blockchain as a whole?
submitted by tycooperaow to Bitcoin [link] [comments]

8000 in ether stolen

I wonder if someone can help with this one. I have a Trezor wallet. I was connected through myTrezor to myetherwallet.com. It asked for my PIN. I went to bed around 1am. I forgot to disconnect my hardware device from the computer. That morning at 6am, 8000 worth of ether was moved out of my ether wallet to an address for Free Wallet. I understand I cannot recover my ether, but does anyone know how this could happen? My hardware was connected, but how could they have done this without my PIN?
submitted by Jtumosa to MyEtherWallet [link] [comments]

Satoshi's unmoved coins are the world's biggest prize in quantum-decryption, the canary in bitcoin's quantum coalmine -u/Anenome5

From this post: u/nullc explained that in the early years, mined bitcoin was paid to the pubkey, not the pubkey-hash.
I was used to the idea that any address that hadn't been spent from was considered quantum-safe. But this isn't true for any coins that were mined and not moved prior to 2012.
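The pubkey vs. pubkey-hash distinction can be sketched in a few lines of Python. This is an illustration only: the pubkey is made up, the scripts are simplified, and plain SHA-256 stands in for Bitcoin's actual HASH160 (RIPEMD-160 of SHA-256), since ripemd160 isn't available in every hashlib build.

```python
import hashlib

def commitment(pubkey: bytes) -> bytes:
    # Stand-in for Bitcoin's HASH160 (RIPEMD-160 of SHA-256); plain
    # SHA-256 is used here because ripemd160 isn't in every hashlib build.
    return hashlib.sha256(pubkey).digest()

# A made-up 33-byte compressed public key, purely for illustration.
pubkey = bytes.fromhex("02" + "11" * 32)

# P2PK (early coinbases): the scriptPubKey embeds the public key itself,
# so it sits on-chain in the clear from the moment the coins are mined.
p2pk_script = pubkey + b"\xac"       # <pubkey> OP_CHECKSIG

# P2PKH (the later standard): only a hash of the key is on-chain; the
# pubkey itself is revealed only when the output is eventually spent.
p2pkh_script = commitment(pubkey)    # simplified: just the commitment
```

A quantum attacker needs the public key to derive the private key, which is why pre-2012 pay-to-pubkey coinbases are exposed while unspent pay-to-pubkey-hash outputs are not.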
What this means is that all of Satoshi's coins are theoretically stealable by anyone who can pull off a successful quantum attack on bitcoin.
In fact, we must now consider them the canary in bitcoin's quantum coalmine because they will likely be the first to fall.
Anyone who can pull off a successful quantum attack on these early unmoved coins stands to make over $500 million. Today.
Everyone will think Satoshi is moving his coins, but it will more likely be a quantum attacker, and that is a shame, unless Satoshi himself wises up and acts soon.
Beyond that, a successful quantum attack may allow someone to masquerade as Satoshi by giving them the private key to these original coins.
The day is quickly approaching when, even if someone were to sign a message using Satoshi's known coin-hoard addresses, we should think twice about whether that person actually is Satoshi, since it may not be long before a successful quantum attack makes his early addresses vulnerable.
Now, this changed in 2012, so current mining to an unspent address is, thankfully, safe. And if you have an address holding coins that has never been spent from, you are also quantum-safe.
I just fear we are in for more Satoshi-hoaxing and drama due to these old addresses. And if Satoshi's coins ever move, we should consider it likely that the quantum nut has finally been cracked by someone and we'll need to be more careful about address reuse.
There may be one other issue: a lot of coin mined before 2012 may never have been spent. Right now we consider much of this coin to be simply lost.
But in the near future, quantum cryptographers may be able to recover much of this coin and make perhaps another $500 million or so.
A billion dollar prize for the quantum researchers out there. Not a bad plum if you ask me.
submitted by parakite to Bitcoin [link] [comments]

What if?

I love BTC and am a huge proponent. However, I am concerned about one problem for BTC being used as a legit currency. What happens if everyone loses their private keys, as so many have already been lost? Human error is a factor, and over time there will be plenty of BTC lost to user error. How can a currency thrive if it is vulnerable to such a problem, with no way to recirculate lost coins? I know BTC can be divided into lots of decimals, but for a currency to have legitimacy its coins cannot be lost forever due to error.
submitted by Skillville to Bitcoin [link] [comments]

Would it be possible to have an altcoin that somehow replenishes "lost" coins back into the system?

I've read that 4 million bitcoins are lost. I'm curious if it's possible to have some kind of system where coins that have been "lost" (however that would be measured) return to the system. Although highly improbable, what if there were 100 coins of something and 99 were lost, and only 1 of the coins (and its subdivisions) were left to be used? I know having a finite supply is part of not being inflationary (at least the way fiat is), and the other side is having an infinite supply (like Dogecoin). But what if something was in the middle? Maybe it exists already, or there's a specific keyword for this. Not sure. Just curious. Would there be a benefit to this?
submitted by TheBloodEagleX to CryptoCurrency [link] [comments]

Proposal: re-issue old unspent coins

Preamble


Hello everyone!
Before you start downvoting me, I know this is a proposal that has been submitted many times already, and that there is a strong opposition to this idea. But please, let me expose my arguments and how I imagine this change. I've read a lot of the previous posts and saw a lot of good points on both sides, but I still believe that discussing this idea is worth the time. You have the right to not agree, and if it is the case, please expose your arguments. I'm not here to enforce my idea, I want to share it with you all, have a constructive debate and contribute to the thinking process of making Bitcoin the best it can be. The outcome of this discussion can only be positive in my eyes, as sharing knowledge and opinions is enlightening for everyone.
The topic I'm going to discuss is a forecast of possible future problems and a proposal to solve them. We can't know for sure how the future will unfold and whether these problems will really happen; only time will tell. However, it is important to think about their possibility and come up with a solution before they even happen. The first step is to discuss how likely they are. Then we can imagine possible solutions.
I know this post is long, but please read it in its entirety before answering. I will be covering several points in an ordered manner to avoid mixing everything up and be as clear as I can.
With that said, let's start.

A lot of coins are lost, and more will be

The main problem I want to address is lost coins. There will always be a maximum of 21 million bitcoins, as you all know. However, a huge amount of coins has already been lost, and more are lost every day. This is not yet a problem, as there is still plenty for everyone despite the scarcity, and also because a good amount is still issued with every new block. Our system is still practical. But as time goes by, fewer and fewer bitcoins will be available and usable. In a very long time, there might not be a single satoshi available anymore. This is a bit extreme, but it highlights the fact that the current system is not sustainable in the very long term, in my opinion.
Having fewer bitcoins available increases scarcity and drives prices up, but it also becomes impractical. Exchanges could not keep as many coins, would dry up, and you wouldn't be able to get into the network that way anymore, especially if you're not a trader. I'm talking about so much scarcity that even a single satoshi is worth a lot. We are limited to 10^-8. (Please bear with me, I know it's been suggested to increase the number of decimals, but I am just exposing the problem for now, not proposing solutions)

Mining will become less profitable

The mining reward decreases with each halving, and eventually miners will only be rewarded with the fees. This is a side problem, by which I mean a related but less important point in my argument.
Miners need an incentive to mine, and this activity should be profitable, otherwise they would stop. No miners, no network. Will fees be enough to keep them mining? Will fees become incredibly high because of that? Will people still use the network if the fees are so high? I don't have the answers to these questions, and it's harder to foresee than the lost coins. In any case, lower fees are desirable for users, and higher rewards are desirable for miners. Any change that helps on both fronts is welcome.
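For reference, the halving schedule driving this concern is simple enough to sketch: the 50 BTC subsidy halves every 210,000 blocks until it reaches zero, after which fees are the only reward.

```python
COIN = 100_000_000          # satoshis per bitcoin
HALVING_INTERVAL = 210_000  # blocks between halvings (~4 years)

def block_subsidy(height: int) -> int:
    """Block subsidy in satoshis, following the halving schedule."""
    halvings = height // HALVING_INTERVAL
    if halvings >= 64:
        return 0  # shifting further is meaningless; the subsidy is gone
    return (50 * COIN) >> halvings
```

The integer right-shift rounds odd amounts down, so the subsidy hits zero around block 6,930,000 (roughly the year 2140).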

Proposal: invalidate and re-issue very old unspent UTXOs

To solve the problem of lost coins and too much scarcity, I suggest that very old unspent UTXOs can be invalidated and re-issued as mining rewards.
I understand that it can rightfully be seen as a theft. This is why I want to try to find a balance so more than 99% of re-issued coins are actually lost. I thought that an expiration time of 100 years (about the time of a long life) would be enough to consider that the coins are lost. It would also be enough in the case of a deceased person who didn't give the recovery phrase to their relatives. It is quite unlikely that holdings stay at the same place for so long. We're talking about a long lifetime!
Another way to increase confidence that re-issued coins are actually lost is to implement a heartbeat into wallets, so they automatically move UTXOs that are about to expire and keep control of them. There are, however, legitimate concerns for cold storage, which would require user action to trigger the heartbeat. But keep in mind that this heartbeat would probably never be needed in your entire life, as the expiration time is so long. Users could of course also do this heartbeat themselves if they want to.
To prevent miners from censoring these transactions in order to get more profit (actually stealing coins, for that matter), the heartbeat would be done several years before expiration. I don't think there would be much incentive to censor these transactions, because the profit from a censored heartbeat would come a very long time later. Miners would rather take the fees from the heartbeat transaction instead.
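A rough sketch of how such expiry and heartbeat rules might look. All parameters here are hypothetical illustrations of the proposal, not real consensus rules; the block counts and window sizes are made up.

```python
# Hypothetical parameters, chosen only to illustrate the proposal.
BLOCKS_PER_YEAR = 52_560                 # ~144 blocks/day * 365 days
EXPIRY_AGE = 100 * BLOCKS_PER_YEAR       # the proposed 100-year expiration
HEARTBEAT_WINDOW = 5 * BLOCKS_PER_YEAR   # refresh several years before expiry

def is_expired(utxo_height: int, tip_height: int) -> bool:
    """An output this old could be invalidated and re-issued to miners."""
    return tip_height - utxo_height >= EXPIRY_AGE

def needs_heartbeat(utxo_height: int, tip_height: int) -> bool:
    """Wallets would re-send the output to themselves in this window."""
    age = tip_height - utxo_height
    return EXPIRY_AGE - HEARTBEAT_WINDOW <= age < EXPIRY_AGE
```

Doing the heartbeat a whole window before expiry is what blunts the censorship incentive: a miner suppressing the transaction would have to wait years before the censored output could be claimed.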

As a bonus, miners would get more than just the fees as their reward. It would help keep fees lower and keep miners mining. Of course this is not a real solution to that problem; it is just a fortunate side effect of the re-issuing. The system should not rely on this alone to sustain network security.

Some people are against this because re-issuing lost coins would decrease scarcity and drive the prices down. I disagree with this statement. The hard limit of 21 million bitcoins will still be there, there will still be scarcity, and it will remain practical. No new coin will be issued. This core principle is kept.

Technically, the following change to the consensus rules would be needed: an unsigned transaction is valid if its inputs spend UTXOs older than the expiration time and the transaction has no outputs (everything goes to the miner).

I know there is a strong opposition to this idea among the Bitcoiners, probably because it is quite in contradiction with one of the core principles of the protocol: you are the only one controlling your money. I understand this point of view and I agree with it. This change would indeed create a way in which your coins can become someone else's without your consent. But as everything in life, no solution is perfect and can be either terribly bad or acceptable, depending on the conditions and if a balance has been found or not. I think that the 100 years expiration time plus heartbeat is a fair proposal.

Compared to increasing the decimals

I saw another idea while reading the previous submissions: increase the maximum amount of decimals. This solution would remove the problem of too much scarcity.
I believe that it is just as much in contradiction with the core principles of Bitcoin as re-issuing is. It would mean that scarcity no longer really means anything, and that we could just print more money, just like fiat.
This is not a bad idea by any means, but it's also an idea that sacrifices something. I think the price to pay is way higher, though.

Both solutions would require a hard fork. I've been proved wrong in the comments: allowing more decimals would apparently not require a hard fork.
However, taking the long expiration time into consideration, re-issuing would not need a hard fork if it's widely accepted and supported by the community. The oldest possible UTXO is currently 10 years old. That means that it could be re-issued in 90 years at minimum. This time span is more than enough for the network to implement and spread the change before it takes effect. So when the first expired UTXO is re-issued, everyone in the network would already handle it (again, assuming the change is accepted) and thus no hard fork would occur. On the other hand, adding another decimal would require a hard fork right away.

Another advantage of coin expiration and re-issuance is that it would prevent the UTXO database from growing unbounded. An unbounded database is not sustainable in the long term. Re-issued UTXOs are not new UTXOs, while adding decimals creates new UTXOs and opens the door to a potentially infinitely large database.

Why not submit the idea to an altcoin?

I can foresee this question being asked. I believe in Bitcoin more than any other project when it comes to decentralized money. My aim is to try to make it the best I think it can be, not for the glory of having contributed to it, nor just for the sake of having my idea implemented somewhere. I want it to have meaning, to be relevant. If the community doesn't like the idea, so be it. I won't make another pointless hard fork. I understand that there must be consensus, and if there is not, why try so hard?

Conclusion

In conclusion, I am certain that we will face a problem one day or another regarding lost coins. There are solutions, but none of them is very good or has support from the community. If we want robust and sustainable decentralized digital money, we have to make a choice and compromise. Would you rather protect your short-term interests in Bitcoin, or have it change the world in the long term?

Now let's talk! I'm eagerly waiting for your responses. Please remain civil, expose your opinion without worrying about being downvoted, give arguments, question everything.
submitted by ImAFlyingPancake to Bitcoin [link] [comments]

How many bitcoins have been lost, abandoned, and will never be retrieved?

Is there a way to discover what this figure is?
Doesn't this amount of missing bitcoins matter? Everyone counts the lost coins as part of the system and part of the total supply of bitcoins... but they're not; they're gone.
submitted by Nose_Grindstoned to Bitcoin [link] [comments]

TIL in 2011 a user running a modified mining client intentionally underpaid himself 1 satoshi, which is the only time bitcoin has ever truly been destroyed.

In block 124724 you'll find txid 5d80a29b which has a payout of 49.99999999 BTC at a time when the block reward was 50 BTC. A transaction fee of 0.01 BTC was also forfeited. This bitcoin no longer exists anywhere in the network, as opposed to "burned" coins which technically still exist in a wallet which no one can ever access (ex: 1BitcoinEaterAddressDontSendf59kuE).
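This is possible because consensus only enforces an upper bound on what a coinbase may claim; claiming less is perfectly valid, and the shortfall simply never comes into existence. A sketch, using the figures from the post (the fee amount is taken from the post, not independently verified):

```python
COIN = 100_000_000  # satoshis per bitcoin

def coinbase_claim_valid(claimed: int, subsidy: int, fees: int) -> bool:
    # Consensus checks only that the coinbase does not exceed
    # subsidy + fees; a modified miner may claim less.
    return claimed <= subsidy + fees

subsidy = 50 * COIN        # block reward at block 124724
fees = COIN // 100         # the 0.01 BTC in fees reported for that block
claimed = 50 * COIN - 1    # 49.99999999 BTC actually paid out

assert coinbase_claim_valid(claimed, subsidy, fees)
# The difference was never created, rather than sent somewhere unspendable.
destroyed = subsidy + fees - claimed
```

Here `destroyed` comes out to 1,000,001 satoshis: the forfeited 0.01 BTC of fees plus the famous 1 satoshi.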
On bitcointalk user midnightmagic explains a deeper meaning behind this:
I did it as a tribute to our missing Satoshi: we are missing Satoshi, and now the blockchain is missing 1 Satoshi too, for all time.
EDIT: Users have pointed out in the comments that this isn't actually the only time coins have been destroyed, there are actually several different ways coins have been destroyed in the past. sumBTC also points out that the satoshi wasn't destroyed-- it was never created in the first place.
Another interesting way to destroy coins is by creating a duplicate transaction. This is again done with a modified client. For example see block 91722 and block 91880. They both contain txid e3bf3d07. The newer transaction essentially overwrites the old transaction, destroying the previous one and the associated coins. This issue was briefly discussed on Github #612 and determined to not be a big deal. In 2012 they realized that duplicated transactions could be used as part of an attack on the network so this was fixed and is no longer possible.
Provably burning coins was actually added as a feature in 2014 via OP_RETURN. Outputs marked with this opcode are provably unspendable and MAY be pruned from a client's unspent-output database. Whether or not these coins still exist is a matter of opinion.
Finally, at least 1,000 blocks forfeited transaction fees due to a software bug. Forfeited transaction fees are lost forever and are unaccounted for in any wallet.
Further reading: https://bitcoin.stackexchange.com/questions/30862/how-much-bitcoin-is-lost-on-average/30864#30864 https://bitcoin.stackexchange.com/questions/38994/will-there-be-21-million-bitcoins-eventually/38998#38998
submitted by NewLlama to Bitcoin [link] [comments]

Age of moving coins as a predictor of the bottom

Research suggests that the age of coins being spent – in particular the 1-year+ UTXOs – is on the rise.
Holders are getting stronger: over half of all bitcoin hasn’t been moved in a year. 20% hasn’t moved for five years, and much of that may be lost. Delphi Digital uses this information to forecast the bottom of the market is close.
Before we go further, a quick definition of terms is in order. Every time bitcoins are ‘spent’ – i.e. moved – new outputs are created on the blockchain. These outputs are chunks of coins that may be combined in a new transaction, or sent individually; it doesn’t matter. Every bitcoin transaction has inputs – coins owned by the sender – and outputs, or the result(s) of that transaction. The outputs are known as Unspent Transaction Outputs, or UTXO.
The UTXOs used for a new transaction may have sat in their addresses for just a few minutes, or they could be hodler coins left dormant for years. Thanks to the nature of the blockchain, and the fact that every transaction is timestamped, we can gain some very interesting insights from the ages of coins being spent. For example, if a coin hasn't moved for five years, you know the owner acquired it back when prices were a fraction of what they are today. It's not a precise science, but it gives us a broad sense of the initial value of coins being moved. And that, in turn, gives us a sense of whether hodlers might be tempted to sell at current prices.
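A back-of-the-envelope sketch of the age-banding idea, using timestamps only. The `hodl_band` helper and its cutoffs are made up for illustration and are not Delphi Digital's actual methodology.

```python
from datetime import datetime, timezone

def hodl_band(created: datetime, now: datetime) -> str:
    """Bucket a UTXO by how long it has sat unmoved, hodl-wave style."""
    years = (now - created).days / 365.25
    if years >= 5:
        return "5y+"     # much of this band may simply be lost coins
    if years >= 1:
        return "1y-5y"
    return "<1y"

now = datetime(2019, 1, 1, tzinfo=timezone.utc)
assert hodl_band(datetime(2013, 1, 1, tzinfo=timezone.utc), now) == "5y+"
```

Summing the value of outputs per band across the whole UTXO set is what produces the "over half hasn't moved in a year, 20% in five" figures quoted above.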
https://cryptoinferno.org/news/age-moving-coins-predictor-bottom/
submitted by Kylew88 to BitcoinMarkets [link] [comments]

Merkle Trees and Mountain Ranges - Making UTXO Set Growth Irrelevant With Low-Latency Delayed TXO Commitments

Original link: https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2016-May/012715.html
Unedited text and originally written by:

Peter Todd pete at petertodd.org
Tue May 17 13:23:11 UTC 2016
# Motivation

UTXO growth is a serious concern for Bitcoin's long-term decentralization. To
run a competitive mining operation, potentially the entire UTXO set must be in
RAM to achieve competitive latency; your larger, more centralized competitors
will have the UTXO set in RAM. Mining is a zero-sum game, so the extra latency
you incur by not doing so, while they do, directly impacts your profit margin.
Secondly, having possession of the UTXO set is one of the minimum requirements
to run a full node; the larger the set, the harder it is to run a full node.

Currently the maximum size of the UTXO set is unbounded as there is no
consensus rule that limits growth, other than the block-size limit itself; as
of writing the UTXO set is 1.3GB in the on-disk, compressed serialization,
which expands to significantly more in memory. UTXO growth is driven by a
number of factors, including the fact that there is little incentive to merge
inputs, lost coins, dust outputs that can't be economically spent, and
non-btc-value-transfer "blockchain" use-cases such as anti-replay oracles and
timestamping.

We don't have good tools to combat UTXO growth. Segregated Witness proposes to
give witness space a 75% discount, in part to make reducing the UTXO set size
by spending txouts cheaper. While this may lead wallets to more often spend
dust, it's hard to imagine an incentive sufficiently strong to discourage most,
let alone all, UTXO-growing behavior.

For example, timestamping applications often create unspendable outputs due to
ease of implementation, and because doing so is an easy way to make sure that
the data required to reconstruct the timestamp proof won't get lost - all
Bitcoin full nodes are forced to keep a copy of it. Similarly anti-replay
use-cases like using the UTXO set for key rotation piggyback on the uniquely
strong security and decentralization guarantee that Bitcoin provides; it's very
difficult - perhaps impossible - to provide these applications with
alternatives that are equally secure. These non-btc-value-transfer use-cases
can often afford to pay far higher fees per UTXO created than competing
btc-value-transfer use-cases; many users could afford to spend $50 to register
a new PGP key, yet would rather not spend $50 in fees to create a standard two
output transaction. Effective techniques to resist miner censorship exist, so
without resorting to whitelists blocking non-btc-value-transfer use-cases as
"spam" is not a long-term, incentive compatible, solution.

A hard upper limit on UTXO set size could create a more level playing field in
the form of fixed minimum requirements to run a performant Bitcoin node, and
make the issue of UTXO "spam" less important. However, making any coins
unspendable, regardless of age or value, is a politically untenable economic
change.


# TXO Commitments

With a merkle tree committing to the state of all transaction outputs, both
spent and unspent, we can provide a method of compactly proving the current
state of an output. This lets us "archive" less frequently accessed parts of
the UTXO set, allowing full nodes to discard the associated data while still
providing a mechanism to spend those archived outputs, by proving to those
nodes that the outputs are in fact unspent.

Specifically, TXO commitments propose a Merkle Mountain Range¹ (MMR), a
type of deterministic, indexable, insertion-ordered merkle tree, which allows
new items to be cheaply appended to the tree with minimal storage requirements,
just log2(n) "mountain tips". Once an output is added to the TXO MMR it is
never removed; if an output is spent, its status is updated in place. Both the
state of a specific item in the MMR, as well as the validity of changes to
items in the MMR, can be proven with log2(n)-sized proofs consisting of a
merkle path to the tip of the tree.
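The append-and-merge behavior of an MMR can be sketched in a few lines. This is a toy model (plain SHA-256, no proofs, no spend tracking), not the proposal's implementation; it only shows how equal-height "mountains" merge like binary carries, leaving the log2(n) tips.

```python
import hashlib

def h(*parts: bytes) -> bytes:
    """Hash helper: SHA-256 over the concatenated parts."""
    return hashlib.sha256(b"".join(parts)).digest()

class MerkleMountainRange:
    """Toy MMR keeping only the 'mountain tips' needed to append items."""
    def __init__(self):
        self.peaks = []  # (height, digest) pairs, strictly decreasing height
        self.size = 0

    def append(self, leaf: bytes) -> None:
        node = (0, h(leaf))
        # Merge equal-height peaks, like binary carry propagation.
        while self.peaks and self.peaks[-1][0] == node[0]:
            height, left = self.peaks.pop()
            node = (height + 1, h(left, node[1]))
        self.peaks.append(node)
        self.size += 1

    def root(self) -> bytes:
        """Bag the peaks right-to-left into a single commitment digest."""
        assert self.peaks, "empty MMR"
        acc = self.peaks[-1][1]
        for _, digest in reversed(self.peaks[:-1]):
            acc = h(digest, acc)
        return acc

# Five txouts a..e, matching the state #2 walkthrough later in the post:
mmr = MerkleMountainRange()
for leaf in (b"a", b"b", b"c", b"d", b"e"):
    mmr.append(leaf)
```

After five appends the MMR holds exactly two peaks (5 = 0b101: one mountain of height 2 over a..d, plus the lone leaf e), mirroring the "0 … 2 … e" shape in the diagrams.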

At an extreme, with TXO commitments we could even have no UTXO set at all,
entirely eliminating the UTXO growth problem. Transactions would simply be
accompanied by TXO commitment proofs showing that the outputs they wanted to
spend were still unspent; nodes could update the state of the TXO MMR purely
from TXO commitment proofs. However, the log2(n) bandwidth overhead per txin is
substantial, so a more realistic implementation is to have a UTXO cache for
recent transactions, with TXO commitments acting as an alternative for the
(rare) event that an old txout needs to be spent.

Proofs can be generated and added to transactions without the involvement of
the signers, even after the fact; there's no need for the proof itself to be
signed, and the proof is not part of the transaction hash. Anyone with access
to the TXO MMR data can (re)generate missing proofs, so minimal, if any,
changes are required to wallet software to make use of TXO commitments.
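Verifying such a proof is ordinary merkle-path recomputation. A toy sketch, assuming a simple SHA-256 binary tree rather than the MMR's exact hashing scheme:

```python
import hashlib

def node_hash(left: bytes, right: bytes) -> bytes:
    return hashlib.sha256(left + right).digest()

def verify_branch(leaf: bytes, branch, root: bytes) -> bool:
    """Recompute the log2(n)-sized path from a leaf up to the tree tip.
    `branch` is a list of (sibling_digest, sibling_is_left) pairs."""
    acc = hashlib.sha256(leaf).digest()
    for sibling, sibling_is_left in branch:
        acc = node_hash(sibling, acc) if sibling_is_left else node_hash(acc, sibling)
    return acc == root

# Build a tiny 4-leaf tree by hand and prove leaf b against its root.
la, lb, lc, ld = (hashlib.sha256(x).digest() for x in (b"a", b"b", b"c", b"d"))
n0, n1 = node_hash(la, lb), node_hash(lc, ld)
root = node_hash(n0, n1)
assert verify_branch(b"b", [(la, True), (n1, False)], root)
```

Note the branch contains only sibling digests; this is why anyone holding the tree data can regenerate a missing proof without the spender's involvement.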


## Delayed Commitments

TXO commitments aren't a new idea - the author proposed them years ago in
response to UTXO commitments. However it's critical for small miners' orphan
rates that block validation be fast, and so far it has proven difficult to
create (U)TXO implementations with acceptable performance; updating and
recalculating cryptographically hashed merkelized datasets is inherently more
work than not doing so. Fortunately, if we maintain a UTXO set for recent
outputs, TXO commitments are only needed when spending old, archived outputs.
We can take advantage of this by delaying the commitment, allowing it to be
calculated well in advance of it actually being used, thus turning a
latency-critical task into a much easier average-throughput problem.

Concretely each block B_i commits to the TXO set state as of block B_{i-n}, in
other words what the TXO commitment would have been n blocks ago, if not for
the n block delay. Since that commitment only depends on the contents of the
blockchain up until block B_{i-n}, the contents of any block after are
irrelevant to the calculation.


## Implementation

Our proposed high-performance/low-latency delayed commitment full-node
implementation needs to store the following data:

1) UTXO set

Low-latency K:V map of txouts definitely known to be unspent. Similar to
existing UTXO implementation, but with the key difference that old,
unspent, outputs may be pruned from the UTXO set.


2) STXO set

Low-latency set of transaction outputs known to have been spent by
transactions after the most recent TXO commitment, but created prior to the
TXO commitment.


3) TXO journal

FIFO of outputs that need to be marked as spent in the TXO MMR. Appends
must be low-latency; removals can be high-latency.


4) TXO MMR list

Prunable, ordered list of TXO MMR's, mainly the highest pending commitment,
backed by a reference counted, cryptographically hashed object store
indexed by digest (similar to how git repos work). High-latency ok. We'll
cover this in more in detail later.


### Fast-Path: Verifying a Txout Spend In a Block

When a transaction output is spent by a transaction in a block we have two
cases:

1) Recently created output

Output created after the most recent TXO commitment, so it should be in the
UTXO set; the transaction spending it does not need a TXO commitment proof.
Remove the output from the UTXO set and append it to the TXO journal.

2) Archived output

Output created prior to the most recent TXO commitment, so there's no
guarantee it's in the UTXO set; transaction will have a TXO commitment
proof for the most recent TXO commitment showing that it was unspent.
Check that the output isn't already in the STXO set (double-spent), and if
not add it. Append the output and TXO commitment proof to the TXO journal.

In both cases recording an output as spent requires no more than two key:value
updates, and one journal append. The existing UTXO set requires one key:value
update per spend, so we can expect new block validation latency to be within 2x
of the status quo even in the worst case of 100% archived output spends.
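The two cases can be sketched as follows. This is a minimal stand-in, not the proposed implementation: proof verification is elided, and the four stores are reduced to plain sets and a deque.

```python
from collections import deque

class TxoState:
    """Reduced stand-ins for the proposal's stores; the TXO MMR store
    itself is out of scope for this sketch."""
    def __init__(self):
        self.utxo = set()       # outpoints created since the last commitment
        self.stxo = set()       # archived outpoints spent since the commitment
        self.journal = deque()  # FIFO of spends awaiting the background MMR update

def apply_spend(state: TxoState, outpoint, proof=None):
    if outpoint in state.utxo:
        # Case 1: recently created output; no commitment proof needed.
        state.utxo.remove(outpoint)
        state.journal.append(outpoint)
    else:
        # Case 2: archived output; must carry a TXO commitment proof
        # (actual merkle-path verification elided in this sketch).
        if proof is None:
            raise ValueError("archived spend requires a TXO commitment proof")
        if outpoint in state.stxo:
            raise ValueError("double spend of archived output")
        state.stxo.add(outpoint)
        state.journal.append((outpoint, proof))
```

Either path is at most two key:value updates plus one journal append, which is the basis for the "within 2x of the status quo" latency estimate above.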


### Slow-Path: Calculating Pending TXO Commitments

In a low-priority background task we flush the TXO journal, recording the
outputs spent by each block in the TXO MMR, and hashing MMR data to obtain the
TXO commitment digest. Additionally this background task removes STXO's that
have been recorded in TXO commitments, and prunes TXO commitment data no longer
needed.

Throughput for the TXO commitment calculation will be worse than the existing
UTXO-only scheme. This impacts bulk verification, e.g. initial block download.
That said, TXO commitments provide other possible tradeoffs that can mitigate
the impact of slower validation throughput, such as skipping validation of old
history, as well as fraud-proof approaches.


### TXO MMR Implementation Details

Each TXO MMR state is a modification of the previous one with most information
shared, so we can space-efficiently store a large number of TXO commitment
states, where each state is a small delta of the previous state, by sharing
unchanged data between states; cycles are impossible in merkelized data
structures, so simple reference counting is sufficient for garbage collection.
Data no longer needed can be pruned by dropping it from the database, and
unpruned by adding it again. Since everything is committed to via cryptographic
hash, we're guaranteed that regardless of where we get the data, after
unpruning we'll have the right data.

Let's look at how the TXO MMR works in detail. Consider the following TXO MMR
with two txouts, which we'll call state #0:

0
/ \
a b

If we add another entry we get state #1:

1
/ \
0 \
/ \ \
a b c

Note how 100% of the state #0 data was reused in commitment #1. Let's
add two more entries to get state #2:

2
/ \
2 \
/ \ \
/ \ \
/ \ \
0 2 \
/ \ / \ \
a b c d e

This time part of state #1 wasn't reused - it wasn't a perfect binary
tree - but we've still got a lot of re-use.
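The append behaviour in these diagrams follows the usual MMR "binary counter" pattern over the mountain tips; a minimal sketch (hash function and encoding are illustrative):

```python
import hashlib

def h(left: bytes, right: bytes) -> bytes:
    return hashlib.sha256(left + right).digest()

# peaks[i] holds the root of a perfect subtree of 2**i leaves, or None.
# Appending merges equal-sized peaks exactly like binary addition with
# carries; everything below an untouched peak is reused unchanged
# between states, which is the structure sharing shown above.
def mmr_append(peaks, leaf):
    carry, i = leaf, 0
    peaks = list(peaks)
    while i < len(peaks) and peaks[i] is not None:
        carry = h(peaks[i], carry)   # merge two size-2**i mountains
        peaks[i] = None
        i += 1
    if i == len(peaks):
        peaks.append(None)
    peaks[i] = carry
    return peaks
```

Appending the 5th leaf (state #2 above) leaves the size-4 mountain untouched, so only O(log n) nodes are new.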

Now suppose state #2 is committed into the blockchain by the most recent block.
Future transactions attempting to spend outputs created as of state #2 are
obliged to prove that they are unspent; essentially they're forced to provide
part of the state #2 MMR data. This lets us prune that data, discarding it,
leaving us with only the bare minimum data we need to append new txouts to the
TXO MMR, the tips of the perfect binary trees ("mountains") within the MMR:

2
/ \
2 \
\
\
\
\
\
e

Note that we're glossing over some nuance here about exactly what data needs to
be kept; depending on the details of the implementation the only data we need
for nodes "2" and "e" may be their hash digest.

Adding three more txouts results in state #3:

3
/ \
/ \
/ \
/ \
/ \
/ \
/ \
2 3
/ \
/ \
/ \
3 3
/ \ / \
e f g h

Suppose recently created txout f is spent. We have all the data required to
update the MMR, giving us state #4. It modifies two inner nodes and one leaf
node:

4
/ \
/ \
/ \
/ \
/ \
/ \
/ \
2 4
/ \
/ \
/ \
4 3
/ \ / \
e (f) g h
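The state #3 to state #4 update can be sketched as recomputing one merkle path: the leaf is replaced by a spent marker and only the log2(n) digests above it change. The spent-marker encoding here is an illustrative assumption.

```python
import hashlib

def h(left: bytes, right: bytes) -> bytes:
    return hashlib.sha256(left + right).digest()

# Illustrative marker for a spent leaf, shown as "(f)" in the diagrams.
SPENT = hashlib.sha256(b"spent").digest()

def mark_spent(path):
    """Given the leaf's merkle path as (sibling_digest, sibling_is_left)
    pairs ordered from leaf level upward, replace the leaf with the
    spent marker and recompute digests up to the root. One leaf and
    log2(n) inner nodes change; all other nodes are shared with the
    previous state."""
    node = SPENT
    for sibling, sibling_is_left in path:
        node = h(sibling, node) if sibling_is_left else h(node, sibling)
    return node  # the new state's root digest
```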

Spending an archived txout requires the transaction to provide the merkle
path to the most recently committed TXO, in our case state #2. If txout b is
spent, the transaction must provide the following data from state #2:

2
/
2
/
/
/
0
\
b

We can add that data to our local knowledge of the TXO MMR, unpruning part of
it:

4
/ \
/ \
/ \
/ \
/ \
/ \
/ \
2 4
/ / \
/ / \
/ / \
0 4 3
\ / \ / \
b e (f) g h

Remember, we haven't _modified_ state #4 yet; we just have more data about it.
When we mark txout b as spent we get state #5:

5
/ \
/ \
/ \
/ \
/ \
/ \
/ \
5 4
/ / \
/ / \
/ / \
5 4 3
\ / \ / \
(b) e (f) g h

Secondly, by now state #3 has been committed into the chain, and transactions
that want to spend txouts created as of state #3 must provide a TXO proof
consisting of state #3 data. The leaf nodes for outputs g and h, and the inner
node above them, are part of state #3, so we prune them:

5
/ \
/ \
/ \
/ \
/ \
/ \
/ \
5 4
/ /
/ /
/ /
5 4
\ / \
(b) e (f)

Finally, let's put this all together, by spending txouts a, c, and g, and
creating three new txouts i, j, and k. State #3 was the most recently committed
state, so the transactions spending a and g are providing merkle paths up to
it. This includes part of the state #2 data:

3
/ \
/ \
/ \
/ \
/ \
/ \
/ \
2 3
/ \ \
/ \ \
/ \ \
0 2 3
/ / /
a c g

After unpruning we have the following data for state #5:

5
/ \
/ \
/ \
/ \
/ \
/ \
/ \
5 4
/ \ / \
/ \ / \
/ \ / \
5 2 4 3
/ \ / / \ /
a (b) c e (f) g

That's sufficient to mark the three outputs as spent and add the three new
txouts, resulting in state #6:

6
/ \
/ \
/ \
/ \
/ \
6 \
/ \ \
/ \ \
/ \ \
/ \ \
/ \ \
/ \ \
/ \ \
6 6 \
/ \ / \ \
/ \ / \ 6
/ \ / \ / \
6 6 4 6 6 \
/ \ / / \ / / \ \
(a) (b) (c) e (f) (g) i j k

Again, state #4 related data can be pruned. In addition, depending on how the
STXO set is implemented, we may also be able to prune data related to txouts
spent after that state, including inner nodes where all txouts under them have
been spent (more on pruning spent inner nodes later).


### Consensus and Pruning

It's important to note that pruning behavior is consensus critical: a full node
that is missing data due to pruning it too soon will fall out of consensus, and
a miner that fails to include a merkle proof that is required by the consensus
is creating an invalid block. At the same time many full nodes will have
significantly more data on hand than the bare minimum so they can help wallets
make transactions spending old coins; implementations should strongly consider
separating the data that is, and isn't, strictly required for consensus.

A reasonable approach for the low-level cryptography may be to actually treat
the two cases differently, with the TXO commitments committing to what data
does and does not need to be kept on hand by the UTXO expiration rules. On the
other hand, leaving that uncommitted allows for certain types of soft-forks
where the protocol is changed to require more data than it previously did.


### Consensus Critical Storage Overheads

Only the UTXO and STXO sets need to be kept on fast random access storage.
Since STXO set entries can only be created by spending a UTXO - and are smaller
than a UTXO entry - we can guarantee that the peak size of the UTXO and STXO
sets combined will always be less than the peak size of the UTXO set alone in
the existing UTXO-only scheme (though the combined size can be temporarily
higher than what the UTXO set size alone would be when large numbers of
archived txouts are spent).

TXO journal entries and unpruned entries in the TXO MMR have log2(n) maximum
overhead per entry: a unique merkle path to a TXO commitment (by "unique" we
mean that no other entry shares data with it). On a reasonably fast system the
TXO journal will be flushed quickly, converting it into TXO MMR data; the TXO
journal will never be more than a few blocks in size.

Transactions spending non-archived txouts are not required to provide any TXO
commitment data; we must have that data on hand in the form of one TXO MMR
entry per UTXO. Once spent however the TXO MMR leaf node associated with that
non-archived txout can be immediately pruned - it's no longer in the UTXO set
so any attempt to spend it will fail; the data is now immutable and we'll never
need it again. Inner nodes in the TXO MMR can also be pruned if all leaves under
them are fully spent; detecting this is easy: the TXO MMR is a merkle-sum tree,
with each inner node committing to the sum of the unspent txouts under it.
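A sketch of the merkle-sum node rule (the byte encodings are illustrative assumptions):

```python
import hashlib

# Each node commits to (digest, sum of unspent value beneath it).
def leaf(value: int, spent: bool = False):
    unspent = 0 if spent else value
    digest = hashlib.sha256(
        b"leaf" + value.to_bytes(8, "big") + bytes([spent])).digest()
    return (digest, unspent)

def inner(left, right):
    total = left[1] + right[1]  # sum of unspent txouts under this node
    digest = hashlib.sha256(
        left[0] + right[0] + total.to_bytes(8, "big")).digest()
    return (digest, total)

def prunable(node) -> bool:
    # Sum 0 means every txout under the node is spent: no valid spend
    # can ever reference it again, so it's safe to discard.
    return node[1] == 0
```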

When an archived txout is spent the transaction is required to provide a merkle
path to the most recent TXO commitment. As shown above that path is sufficient
information to unprune the necessary nodes in the TXO MMR and apply the spend
immediately, reducing this case to the TXO journal size question (non-consensus
critical overhead is a different question, which we'll address in the next
section).

Taking all this into account, the only significant storage overhead of our TXO
commitments scheme when compared to the status quo is the log2(n) merkle path
overhead; as long as less than 1/log2(n) of the UTXO set consists of active,
non-archived UTXO's, we've come out ahead, even in the unrealistic case where
all available storage is equally fast. In the real world that isn't yet the
case - even SSDs are significantly slower than RAM.


### Non-Consensus Critical Storage Overheads

Transactions spending archived txouts pose two challenges:

1) Obtaining up-to-date TXO commitment proofs

2) Updating those proofs as blocks are mined

The first challenge can be handled by specialized archival nodes, not unlike
how some nodes make transaction data available to wallets via bloom filters or
the Electrum protocol. There's a whole variety of options available, and the
data can be easily sharded to scale horizontally; the data is self-validating,
allowing horizontal scaling without trust.

While miners and relay nodes don't need to be concerned about the initial
commitment proof, updating that proof is another matter. If a node aggressively
prunes old versions of the TXO MMR as it calculates pending TXO commitments, it
won't have the data available to update the TXO commitment proof to be against
the next block, when that block is found; the child nodes of the TXO MMR tip
are guaranteed to have changed, yet aggressive pruning would have discarded that
data.

Relay nodes could ignore this problem if they simply accept the fact that
they'll only be able to fully relay the transaction once, when it is initially
broadcast, and won't be able to provide mempool functionality after the initial
relay. Modulo high-latency mixnets, this is probably acceptable; the author has
previously argued that relay nodes don't need a mempool² at all.

For a miner though not having the data necessary to update the proofs as blocks
are found means potentially losing out on transactions fees. So how much extra
data is necessary to make this a non-issue?

Since the TXO MMR is insertion ordered, spending a non-archived txout can only
invalidate the upper nodes of the archived txout's TXO MMR proof (if this
isn't clear, imagine a two-level scheme, with per-block TXO MMRs committed
by a master MMR for all blocks). The maximum number of relevant inner nodes
changed is log2(n) per block, so if there are n non-archival blocks between the
most recent TXO commitment and the pending TXO MMR tip, we have to store
log2(n)*n inner nodes - on the order of a few dozen MB even when n is a
(seemingly ridiculously high) year's worth of blocks.
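A quick back-of-the-envelope check of that claim, assuming 32-byte digests and ~6 blocks per hour (both constants are assumptions for illustration):

```python
import math

blocks_per_year = 6 * 24 * 365     # ~52,560 blocks at ~10 min/block
digest_size = 32                   # bytes per stored inner node (assumed)

# log2(n) changed inner nodes per block, n blocks of delay:
inner_nodes = blocks_per_year * math.log2(blocks_per_year)
megabytes = inner_nodes * digest_size / 1e6   # roughly 26 MB
```

So even a year-long commitment delay costs only tens of MB of low-IO-requirement storage, consistent with the estimate above.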

Archived txout spends on the other hand can invalidate TXO MMR proofs at any
level - consider the case of two adjacent txouts being spent. To guarantee
success requires storing full proofs. However, they're limited by the blocksize
limit, and additionally are expected to be relatively uncommon. For example, if
1% of 1MB blocks was archival spends, our hypothetical year long TXO commitment
delay is only a few hundred MB of data with low-IO-performance requirements.


## Security Model

Of course, a TXO commitment delay of a year sounds ridiculous. Even the slowest
imaginable computer isn't going to need more than a few blocks of TXO
commitment delay to keep up ~100% of the time, and there's no reason why we
can't have the UTXO archive delay be significantly longer than the TXO
commitment delay.

However, as with UTXO commitments, TXO commitments raise issues with Bitcoin's
security model by allowing miners to profitably mine transactions without
bothering to validate prior history. At the extreme, if there were no
commitment delay at all, then at the cost of some extra network bandwidth
"full" nodes could operate and even mine blocks completely statelessly by
expecting all transactions to include "proof" that their inputs are unspent; a
TXO commitment proof for a commitment you haven't verified isn't a proof that a
transaction output is unspent, it's a proof that some miners claimed the txout
was unspent.

At one extreme, we could simply implement TXO commitments in a "virtual"
fashion, without miners actually including the TXO commitment digest in their
blocks at all. Full nodes would be forced to compute the commitment from
scratch, in the same way they are forced to compute the UTXO state, or total
work. Of course a full node operator who doesn't want to verify old history can
get a copy of the TXO state from a trusted source - no different from how you
could get a copy of the UTXO set from a trusted source.

A more pragmatic approach is to accept that people will do that anyway, and
instead assume that sufficiently old blocks are valid. But how old is
"sufficiently old"? First of all, if your full node implementation comes "from
the factory" with a reasonably up-to-date minimum accepted total-work
thresholdⁱ - in other words it won't accept a chain with less than that amount
of total work - it may be reasonable to assume any Sybil attacker with
sufficient hashing power to make a forked chain meeting that threshold with,
say, six months worth of blocks has enough hashing power to threaten the main
chain as well.

That leaves public attempts to falsify TXO commitments, done out in the open by
the majority of hashing power. In this circumstance the "assumed valid"
threshold determines how long the attack would have to go on before full nodes
start accepting the invalid chain, or at least, newly installed/recently reset
full nodes. The minimum age that we can "assume valid" is a tradeoff between
political/social/technical concerns; we probably want at least a few weeks to
guarantee the defenders a chance to organise themselves.

With this in mind, a longer-than-technically-necessary TXO commitment delayʲ
may help ensure that full node software actually validates some minimum number
of blocks out-of-the-box, without taking shortcuts. However this can be
achieved in a wide variety of ways, such as the author's prev-block-proof
proposal³, fraud proofs, or even a PoW with an inner loop dependent on
blockchain data. Like UTXO commitments, TXO commitments are also potentially
very useful in reducing the need for SPV wallet software to trust third parties
providing them with transaction data.

i) Checkpoints that reject any chain without a specific block are a more
common, if uglier, way of achieving this protection.

j) A good homework problem is to figure out how the TXO commitment could be
designed such that the delay could be reduced in a soft-fork.


## Further Work

While we've shown that TXO commitments certainly could be implemented without
increasing peak IO bandwidth/block validation latency significantly with the
delayed commitment approach, we're far from being certain that they should be
implemented this way (or at all).

1) Can a TXO commitment scheme be optimized sufficiently to be used directly
without a commitment delay? Obviously it'd be preferable to avoid all the above
complexity entirely.

2) Is it possible to use a metric other than age, e.g. priority? While this
complicates the pruning logic, it could use the UTXO set space more
efficiently, especially if your goal is to prioritise bitcoin value-transfer
over other uses (though if "normal" wallets nearly never need to use TXO
commitments proofs to spend outputs, the infrastructure to actually do this may
rot).

3) Should UTXO archiving be based on a fixed size UTXO set, rather than an
age/priority/etc. threshold?

4) By fixing the problem (or possibly just "fixing" the problem) are we
encouraging/legitimising blockchain use-cases other than BTC value transfer?
Should we?

5) Instead of TXO commitment proofs counting towards the blocksize limit, can
we use a different miner fairness/decentralization metric/incentive? For
instance it might be reasonable for the TXO commitment proof size to be
discounted, or ignored entirely, if a proof-of-propagation scheme (e.g.
thinblocks) is used to ensure all miners have received the proof in advance.

6) How does this interact with fraud proofs? Obviously furthering dependency on
non-cryptographically-committed STXO/UTXO databases is incompatible with the
modularized validation approach to implementing fraud proofs.


# References

1) "Merkle Mountain Ranges",
Peter Todd, OpenTimestamps, Mar 18 2013,
https://github.com/opentimestamps/opentimestamps-server/blob/master/doc/merkle-mountain-range.md

2) "Do we really need a mempool? (for relay nodes)",
Peter Todd, bitcoin-dev mailing list, Jul 18th 2015,
https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-July/009479.html

3) "Segregated witnesses and validationless mining",
Peter Todd, bitcoin-dev mailing list, Dec 23rd 2015,
https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-December/012103.html

--
https://petertodd.org 'peter'[:-1]@petertodd.org
submitted by Godballz to CryptoTechnology [link] [comments]

Satoshi's unmoved coins are the world's biggest prize in quantum-decryption, the canary in bitcoin's quantum coalmine

/Nullc recently explained that in the early years, mined bitcoin was paid to the pubkey, not the pubkey-hash.
I was used to the idea that any address that hadn't been spent from was considered quantum-safe. But this isn't true for any coins that were mined and not moved prior to 2012.
What this means is that all of Satoshi's coins are theoretically stealable by anyone who can pull off a successful quantum attack on bitcoin.
In fact, we must now consider them the canary in bitcoin's quantum coalmine because they will likely be the first to fall.
Anyone who can pull off a successful quantum attack on these early unmoved coins will make over $500 million. Today.
Everyone will think Satoshi is moving his coins, but in fact it will more likely be a quantum attacker, and that is a shame, unless Satoshi himself wises up and acts soon.
Beyond that, a successful quantum attack may allow someone to masquerade as Satoshi by giving them the private key to these original coins.
The day is quickly approaching where even if someone were to sign a message using Satoshi's known coin hoard addresses, we should think twice about whether this person actually is Satoshi or not, since it may not be long before a successful quantum attack will make his early addresses vulnerable to exposure.
Now this vulnerability changed in 2012, so current mining to an unspent address is, thankfully, safe. And if you have an address with coins in it that has never been spent from, you are also quantum safe.
I just fear we are in for more Satoshi-hoaxing and drama due to these old addresses. And if Satoshi's coins ever move, we should consider it likely that the quantum nut has finally been cracked by someone and we'll need to be more careful about address reuse.
There may be one other issue. There may be a lot of pre-2012 mined coin that has never been spent. Right now we consider much of this coin to be simply lost.
But in the near future, quantum cryptographers may be able to recover much of this coin and make perhaps another $500 million or so.
A billion dollar prize for the quantum researchers out there. Not a bad plum if you ask me.
submitted by Anenome5 to Bitcoin [link] [comments]

The Problems with Segregated Witness

MORE: https://medium.com/the-publius-letters/segregated-witness-a-fork-too-far-87d6e57a4179
... 3. The Problems with Segregated Witness
While it is true that Segregated Witness offers some improvements to the Bitcoin network, we shall now examine why these benefits are not nearly enough to outweigh the dangers of deploying SW as a soft fork.
3.1 SW creates a financial incentive for bloating witness data
SW allows for a theoretical maximum block size limit of ~4 MB. However, this is only true if the entire block was occupied with transactions of a very small ‘base size’ (e.g. P2WPKH with 1 input, 1 output). In practice, based on the average transaction size today and the types of transactions made, the block size limit is expected to have a maximum limit of ~1.7 MB post-SW (Figure 10; assuming all transactions are using SW unspent outputs — a big assumption).
However, the 4 MB theoretical limit creates a key problem. Miners and full node operators need to ensure that their systems can handle the 4 MB limit, even though at best they will only be able to support ~40% of that transaction capacity. Why? Because there exists a financial incentive for malicious actors to design transactions with a small base size but large and complex witness data. This is exacerbated by the fact that witness scripts (i.e. P2WSH or P2SH-P2WSH) will have higher script size limits than normal P2SH redeem scripts (i.e., from 520 bytes to 3,600 bytes [policy] or 10,000 bytes [consensus]). These potential problems only worsen as the block size limit is raised in the future, for example a 2 MB maximum base size creates an 8 MB adversarial case. This problem hinders scalability and makes future capacity increases more difficult.
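The adversarial asymmetry described in this section comes from segwit's weight rule: block weight = 4*base_size + witness_size, capped at 4,000,000 weight units. A quick sketch of the arithmetic (the ~55% witness fraction for a "typical" block is an illustrative assumption):

```python
WEIGHT_LIMIT = 4_000_000  # weight units

def block_weight(base_size: int, witness_size: int) -> int:
    # weight = 3*base_size + total_size = 4*base_size + witness_size
    return 4 * base_size + witness_size

# Adversarial block: near-zero base data, witness-stuffed; ~4 MB of
# witness bytes still fits under the limit.
assert block_weight(0, 4_000_000) <= WEIGHT_LIMIT

# Typical mix: if ~55% of a block's bytes are witness data, the largest
# block solves 4*(0.45*t) + 0.55*t = WEIGHT_LIMIT for total size t:
typical_total_bytes = WEIGHT_LIMIT / (4 - 3 * 0.55)  # ~1.7 MB
```

This is why nodes must provision for the 4 MB worst case while ordinary traffic only yields ~1.7 MB blocks.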
3.2 SW fails to sufficiently address the problems it intends to solve
If SW is activated by soft fork, Bitcoin will effectively have two classes of UTXOs (non-SW vs SW UTXOs), each with different security and economic properties. Linear signature hashing and malleability fixes will only be available to the SW UTXO. Most seriously, there are no enforceable constraints to the growth of the non-SW UTXO. This means that the network (even upgraded nodes) are still vulnerable to transaction malleability and quadratic signature hashing from non-SW outputs that existed before or created after the soft fork.
The lack of enforceability that comes with a soft fork leaves Bitcoin users and developers vulnerable to precisely the type of attacks SW is designed to prevent. While spending non-SW outputs will be comparatively more expensive than SW outputs, this remains a relatively weak disincentive for a motivated attacker.
It is also unclear what proportion of the total number of the legacy UTXO will migrate to SW outputs. Long-term holders of Bitcoin, such as Satoshi Nakamoto (presumed to be in possession of ~1 million Bitcoin), may keep their coins in non-SW outputs (although it would be a significant vote of confidence in SW by Nakamoto if they were to migrate!). This makes future soft or hard forks to Bitcoin more difficult as multiple classes of UTXOs must now be supported to prevent coins from being burned or stolen.
One key concern is that the coexistence of two UTXO types may tempt developers and miners in the future to destroy the non-SW UTXO. Some may view this as an unfounded concern, but the only reason that this is worth mentioning in this article are the comments made by influential individuals associated with Bitcoin Core: Greg Maxwell has postulated that “abandoned UTXO should be forgotten and become unspendable,” and Theymos has claimed “the very-rough consensus is that old coins should be destroyed before they are stolen to prevent disastrous monetary inflation.”
As the security properties of SW outputs are marginally better than non-SW outputs, it may serve as a sufficient rationalization for this type of punitive action.
The existence of two UTXO types with different security and economic properties also deteriorates Bitcoin’s fungibility. Miners and fully validating nodes may decide not to relay, or include in blocks, transactions that spend to one type or the other. While on one hand this is a positive step towards enforceability (i.e. soft enforceability), it is detrimental to unsophisticated Bitcoin users who have funds in old or non-upgraded wallets. Furthermore, it is completely reasonable for projects such as the lightning network to reject forming bidirectional payment channels (i.e. a multisignature P2SH address) using non-SW P2SH outputs due to the possibility of malleability. Fundamentally this means that the face-value of Bitcoin will not be economically treated the same way depending on the type of output it comes from.
It is widely understood in software development that measures which rely on the assumption of users changing their behavior to adopt better security practices are fundamentally doomed to fail; more so when the unpatched vulnerabilities are permitted to persist and grow. An example familiar to most readers would be the introduction and subsequent snail’s pace uptake of HTTPS.
3.3 SW places complex requirements on developers to comply while failing to guarantee any benefits
SW as a soft fork brings with it a mountain of irreversible technical debt, with multiple opportunities for developers to permanently cause the loss of user funds. For example, the creation of P2SH-P2WPKH or P2SH-P2WSH addresses requires the strict use of compressed pubkeys, otherwise funds can be irrevocably lost. Similarly, the use of OP_IF, OP_NOTIF, OP_CHECKSIG, and OP_CHECKMULTISIG must be carefully handled for SW transactions in order to prevent the loss of funds. It is all but certain that some future developers will cause user loss of funds due to an incomplete understanding of the intricacies of SW transaction formats.
In terms of priorities, SW is not a solution to any of the major support ticket issues that are received daily by Bitcoin businesses such as BitPay, Coinbase, Blockchain.info, etc. The activation of SW will not increase the transaction capacity of Bitcoin overnight, but only incrementally as a greater percentage of transactions spend to SW outputs. Moreover, the growing demand for on-chain transactions may very well exceed the one-off capacity increase as demonstrated by the increasing frequency of transaction backlogs.
In contrast to a basic block size increase (BBSI) from a coordinated hard fork, many wallets and SPV clients will immediately benefit from new capacity increases without the need to rewrite their own software as they must do with SW. With a BBSI, unlike SW, there are no transaction format or signature changes required on the part of Bitcoin-using applications.
Based on previous experience with soft forks in Bitcoin, upgrades tend to roll out within the ecosystem over some time. At the time of this writing, only 28 out of the 78 businesses and projects (36%) who have publicly committed to the upgrade are SW-compatible. Any capacity increase that Bitcoin businesses and users of the network desire to ease on-chain fee pressure will likely not be felt for some time, assuming that transaction volume remains unchanged and does not continue growing. The unpredictability of this capacity increase and the growth of the non-SW UTXO are particularly troubling for Bitcoin businesses from the perspectives of user-growth and security, respectively. Conversely, a BBSI delivers an immediate and predictable capacity increase.
The voluntary nature of SW upgrades is subject to the first-mover game theory problem. With a risky upgrade that moves transaction signatures to a new witness field that is hidden to some nodes, the incentive for the rational actor is to let others take that risk first, while the rational actor sits back, waits, and watches to see if people lose funds or have problems. Moreover, the voluntary SW upgrade also suffers from the free-rider game theory problem. If others upgrade and move their data to the witness field, one can benefit even without upgrading or using SW transactions themselves. These factors further contribute to the unpredictable changes to Bitcoin’s transaction capacity and fees if SW is adopted via a soft fork.
3.4 Economic distortions and price fixing
Segregated Witness as a soft fork alters the economic incentives that regulate access to Bitcoin’s one fundamental good: block-size space. Firstly, it subsidises signature data in large/complex P2WSH transactions (i.e., at ¼ of the cost of transaction/UTXO data). However, the signatures are more expensive to validate than the UTXO, which makes this unjustifiable in terms of computational cost. The discount itself appears to have been determined arbitrarily and not for any scientific or data-backed reasoning.
Secondly, the centralized and top-down planning of one of Bitcoin’s primary economic resources, block space, further disintermediates various market forces from operating without friction. SW as a soft fork is designed to preserve the 1 MB capacity limit for on-chain transactions, which will purposely drive on-chain fees up for all users of Bitcoin. Rising transaction fees, euphemistically called a ‘fee market’, is anything but a market when one side — i.e. supply — is fixed by central economic planners (the developers) who do not pay the costs for Bitcoin’s capacity (the miners). Economic history has long taught us the results of non-market intervention in the supply of goods and services: the costs are externalised to consumers. The adoption of SW as a soft fork creates a bad precedent for further protocol changes that affirm this type of economic planning.
3.5 Soft fork risks
In this section we levy criticisms of soft forks more broadly when they affect the protocol and economic properties of Bitcoin to the extent that SW does. In this case, a soft fork reduces the security of full nodes without the consent of the node operator. The SW soft fork forces node operators either to upgrade, or to unconditionally accept the loss of security by being downgraded to a SPV node.
Non-upgraded nodes further weaken the general security of Bitcoin as a whole through the reduction of the number of fully validating nodes on the network. This is because non-upgraded nodes will only perform the initial check to see if the redeem script hash matches the pubkey script hash of the unspent output. This redeem script may contain an invalid witness program, for P2WSH transactions, that the non-upgraded node doesn’t know how to verify. This node will then blindly relay the invalid transaction across the network.
SW as a soft fork is the opposite of anti-fragile. Even if the community wants the change (i.e., an increase in transaction capacity), soft-forking to achieve these changes means that the miners become the key target of lobbying (and they already are). Soft forks are risky in this context because it becomes relatively easy to change things, which may be desirable if the feature is both minor and widely beneficial. However, it is bad in this case because the users of Bitcoin (i.e. everyone else but the miners) are not given the opportunity to consent or opt-out, despite being affected the most by such a sweeping change. This can be likened to a popular head of state who bends the rules of jurisprudence to bypass slow legal processes to “get things done.” The dangerous precedent of taking legal shortcuts is not of concern to the masses until a new, less popular leader takes hold of the reins, and by then it is too late to reverse. In contrast, activating SW via a hard fork ensures that the entire community, not just the miners, decide on changes made to the protocol. Users who unequivocally disagree with a change being made are given the clear option not to adopt the change — not so with a soft fork.
3.6 Once activated, SW cannot be undone and must remain in Bitcoin codebase forever.
If any critical bugs resulting from SW are discovered down the road, bugs serious enough to contemplate rolling it back, then anyone will be able to spend native SW outputs, leading to a catastrophic loss of funds. ...
...
Conclusion
Segregated Witness is the most radical and irresponsible protocol upgrade Bitcoin has faced in its eight year history. The push for the SW soft fork puts Bitcoin miners in a difficult and unfair position to the extent that they are pressured into enforcing a complicated and contentious change to the Bitcoin protocol, without community consensus or an honest discussion weighing the benefits against the costs. The scale of the code changes are far from trivial — nearly every part of the codebase is affected by SW.
While increasing the transaction capacity of Bitcoin has already been significantly delayed, SW represents an unprofessional and ineffective solution to both transaction malleability and scaling. As a soft fork, SW introduces more technical debt to the protocol and fundamentally fails to achieve its design purpose. As a hard fork, combined with real on-chain scaling, SW can effectively mitigate transaction malleability and quadratic signature hashing. Each of these issues are too important for the future of Bitcoin to gamble on SW as a soft fork and the permanent baggage that comes with it.
As much as the authors of this article desire transaction capacity increases, it is far better to work towards a clean technical solution to malleability and scaling than to further encumber the Bitcoin protocol with permanent technical debt. ...
MORE: https://medium.com/the-publius-letters/segregated-witness-a-fork-too-far-87d6e57a4179
submitted by german_bitcoiner to btc [link] [comments]

Roadmap Updates - Thursday, January 4th

CO-WORKING SPACE, HONG KONG
MEETUP FOR DEVELOPERS AND ENTERPRISES, TOKYO
PARTNERSHIP WITH CHINACCELERATOR AND MOX
INVESTMENT IN A BLOCKCHAIN PROJECT, SHANGHAI
R&D CENTER, VIETNAM
DISTRIBUTED FUTURES RESEARCH
LEDGER WALLET
DEBIT CARDS
CARDANO SOFTWARE AUDIT
SIDECHAINS (5% Completion)
ACCOUNTING MODEL (10% Completion)
PLUTUS CORE (12% Completion)
IELE VIRTUAL MACHINE V1 (33% Completion)
INTEGRATION AND IMPLEMENTATION (15% Completion)
SMART CONTRACTS DEPLOYMENT & INTERACTION
submitted by Vascular_D to CardanoCoin [link] [comments]

Possibilities of XRP Reaching $100

Given the current amount of XRP circulating in the market (38,739,144,847 XRP according to CoinMarketCap), the price of XRP reaching $100 is, I don't want to say impossible, because God knows that word becomes more and more meaningless as time goes on, so I'll just say I'm very skeptical. In theory, if it ever did by some miraculous chain of events reach $100, the market cap would be trillions of dollars, around 10x - 15x more than bitcoin.
But that does not mean that XRP can't reach $100, and I believe that it'll happen soon, maybe sometime next year, because let us not forget about Ripple's escrow. They have an article where they say, and I quote:
"We’ll then return whatever is unused at the end of each month to the back of the escrow rotation. For example, if 500M XRP remain unspent at the end of the first month, those 500M XRP will be placed into a new escrow account set to expire in month 55. For comparison, Ripple has sold on average 300M XRP per month for the past 18 months."
So on average, they sell 300M XRP per month, and there are currently around 38,739,144,847 XRP circulating, so what do you guys think will happen at the end of the month when all those unsold XRP are added back into an escrow?
Of course, the circulating amount will decrease, and XRP reaching $100, would not be so impossible.
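As a quick back-of-the-envelope check on the numbers above (the supply figure is the one quoted from CoinMarketCap; the "10x - 15x more than bitcoin" comparison depends on Bitcoin's market cap at the time):

```python
# Implied market cap if XRP hit $100 at the quoted circulating supply.
circulating = 38_739_144_847           # XRP, per the CoinMarketCap figure above
implied_cap = circulating * 100        # dollars
print(f"${implied_cap / 1e12:.2f} trillion")  # $3.87 trillion
```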
These are my thoughts and theories, and they should be taken with a bit of skepticism, because I myself am a bit skeptical. But I'm also interested in the data that I'm seeing, and I don't want my skepticism to cost me a great opportunity to bank on XRP, so I'm definitely investing more.
[Invest at your own discretion]
[EDIT] Here's the link to that article
https://ripple.com/insights/ripple-escrows-55-billion-xrp-for-supply-predictability/
submitted by bullishmaniac to Ripple [link] [comments]

How Much BTC (And retroactively, BCH) has moved from 2009 to 2010?

I was going to post this in /bitcoin but I have a feeling it would just get deleted. I’m curious if there is some kind of reporting/method that shows the following
I think this will tell us a few things:
submitted by OneSmallStepForLambo to btc [link] [comments]

How To Use Coin Control & Stake More With 'PoS' DEAL Coins

How To Use Coin Control & Stake More With 'PoS' DEAL Coins
When you are using Proof of Stake (PoS) coins, active coin control is essential to ensuring that you maximize your PoS rewards. If you do not use coin control you may benefit less from the compounding of your rewards, and you may lose more rewards than necessary when you make a payment from your wallet.
Exactly how much difference this makes will depend on the specifications of the coin itself, and what transactions you make. It is particularly important for coins such as IdealCash, in which small and medium-sized holders must wait a long time between each time they stake and get a reward.
This presents a manual method of coin control for users of Qt wallets!
Why You Need to Use Coin Control
As I already stated, small and medium-sized IdealCash holders must wait a long time before they stake. When you do finally get to stake some coins, you do not get the interest due on your whole balance; instead, one of the payments you have received is selected and you earn the rewards due on that amount. The rest of your interest is not lost, it keeps building up and you get it later. But what this means is that if you receive a single payment for 10,000 coins, for example, you will earn more compound interest than if you receive 10 payments for 1,000 coins, even though your balance is the same. This is because in both cases you will wait the same amount of time before getting your first reward, but in the first instance you will receive all of your interest at once, whereas in the second example you will receive only 10% as much on your first stake. Because you can earn PoS rewards on your PoS rewards (compounding), this means you will eventually end up with more money in the first example, where you got a single payment, than in the second, where you got 10 payments adding up to the same amount.
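The compounding effect can be seen with a toy model. This is not IdealCash's actual staking schedule — the 1% rate and 20-period horizon are made-up numbers — it only illustrates why a reward received all at once compounds to more than the same reward received in ten installments:

```python
def fv(amount, rate, periods):
    # Future value of a reward that compounds from the period it arrives.
    return amount * (1 + rate) ** periods

RATE, HORIZON = 0.01, 20  # hypothetical: 1% per period, 20-period horizon

# 100 coins of reward received at once vs. ten 10-coin installments, one per
# period: later installments have less time left to compound.
lump = fv(100, RATE, HORIZON)
split = sum(fv(10, RATE, HORIZON - k) for k in range(10))
print(lump > split)  # True
```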
In the example above you could easily fix this without getting into any advanced techniques simply by using your wallet in the ordinary way to send yourself 10,000 coins. By doing this you would combine all of the outputs – all 10 of the 1,000 IdealCash payments – into a single output. But sometimes things can be a little more complicated. For example, let's say you receive a single payment of 10,000 coins, and leave it in your wallet to stake. After 8 weeks you still haven’t earned your first reward, but have built up a nice bit of ‘interest owed’ on these coins. You now receive those 10 payments for 1,000 each we looked at before. When you spend any coins in your wallet, you lose the unpaid rewards owed to you for staking them. So in this example, you would want to send a 10,000 IdealCash payment to yourself using these 10 new payments. But if you simply use the wallet as normal to send a payment to yourself it will probably not do this – it will use the 10,000 you got in your first deposit, destroying your earned rewards and doing nothing to consolidate the smaller payments. To do this, you need to be able to specify which outputs (which payments received) you want to use to create your transaction; that is what I am going to show you how to accomplish in this guide.
Coin control can also be useful when sending payments from your wallet. By specifying which outputs to use in order to make a payment you can ensure that you are not losing rewards unnecessarily. Please be aware, however, that when sending payments to somebody else you will probably need to send yourself change because each output is used entirely in each transaction. I will explain this properly later on in the guide, but for now just be aware that if you don’t do this you may lose some of your coins.
Manual Coin Control Using the Qt Wallet Console
Manual coin control requires you to use the console rather than the GUI. That means instead of just selecting options from menus, you have to manually type commands. These commands must be precise and use the correct grammar or they won’t work, so please read the instructions carefully and check that you have typed each command correctly before hitting enter.
To open a console window you just need to click ‘Help’ from the navigation menu of your wallet, then select ‘Debug Window’. This will open up a new window with two tabs: click ‘console’ to get to the tab that we are going to use here.
Step One: List Unspent Outputs

In your console window type “listunspent” (without the quotation marks) and hit enter. This will give you a list of all the outputs of previous transactions which belong to you, or in more plain language, all of the payments that you have received.
Each one will appear between curly brackets and should look something like this:
{
"txid" : "a long list of random looking characters should be here",
"vout" : "this should be a number",
"address" : "this is the IdealCash address belonging to you that these coins where sent to",
"account" : "",
"scriptPubKey" : "another long list of random looking characters",
"amount" : "the number of coins you received",
"confirmations" : "the number of confirmations for this transaction"
}
You can use this list to select the outputs you want to use to make a payment. For example, IdealCash owners may want to select smaller and newer outputs (newer outputs will have fewer confirmations) and leave any older and larger ones.
When you have chosen which ones you want to use, make a note of the exact “amount” for each one. If you have received coins with more than one address then you will also need a list of each address that appears in one of the outputs you are going to use.
Once you have made a note of these things separately, we only need the “txid”, “vout” and “scriptPubKey” to put into the next step. You may like to copy and paste each one into a text editor and then delete every line apart from those three, to leave something like this:
{
"txid" : "a long list of random looking characters should be here",
"vout" : "this should be a number",
"scriptPubKey" : "another long list of random looking characters"
}
Notice that I have removed the comma from the end of the “scriptPubKey” line. Commas must always be included at the end of each line, apart from the last one. This is one of those annoying grammar things that you have to get right (and which I personally mess up more often than not). Don’t worry too much, if you get this kind of thing wrong you won’t lose any coins, it just won’t accept the command. Also please note that if you are not a programmer then you officially will be after you’ve done this, so feel free to be a bit proud of yourself.
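The selecting-and-trimming step above can also be scripted instead of done in a text editor. A minimal sketch, assuming you have pasted the `listunspent` result into a Python list of dicts (the field names are the ones shown above; the thresholds and sample values are made up):

```python
def pick_outputs(unspent, max_amount=2000, max_confirmations=5000):
    # Keep smaller/newer outputs (tune the thresholds to taste), then strip
    # each one down to the three fields createrawtransaction needs.
    chosen = [u for u in unspent
              if u["amount"] <= max_amount and u["confirmations"] <= max_confirmations]
    return [{k: u[k] for k in ("txid", "vout", "scriptPubKey")} for u in chosen]

utxos = [
    {"txid": "aa" * 32, "vout": 0, "address": "D...", "account": "",
     "scriptPubKey": "76a9...", "amount": 1000, "confirmations": 120},
    {"txid": "bb" * 32, "vout": 1, "address": "D...", "account": "",
     "scriptPubKey": "76a9...", "amount": 10000, "confirmations": 90000},
]
print(pick_outputs(utxos))  # only the small, recent output survives, trimmed to 3 keys
```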
Creating a Raw Transaction
The next step is to use the outputs we have selected to build the code for a complete transaction.
You should have a list of edited outputs from the last section. Put them, complete with brackets, into the command below:
createrawtransaction '[output1, output2]' '{"address you want to send coins to" : number of coins to send}'

The number of coins you are sending must be exactly equal to the total of all the outputs you have used minus the transaction fee (0.01). It is very important that you get this right as you could lose coins if you make a mistake here. For example, if you have one output with an amount of 1,000 and another with an amount of 2,000 then you must send 2999.99 coins (1000+2000-0.01). If you were to make a mistake and send just 2,000 coins, then you would end up paying a fee of 1,000 coins that you would never see again.
To give you an idea of what this should look like, you can see an example of a raw bitcoin transaction taken from this Reddit thread below:
createrawtransaction '[{"txid":"c7e5e03d2ab5458819eedec46d0ba38ca7a6525e38b493073277cf4a1550a348","vout":1,"scriptPubKey":"76a9144a06df74729aef1dce5e4641960da3a439d2460b88ac"},
{"txid":"c7e5e03d2ab5458819eedec46d0ba38ca7a6525e38b493073277cf4a1550a348","vout":0,"scriptPubKey":"76a914f88262828f5e64b454249e4c45ddb6071a2ab0a988ac"}]'
'{"Dty3qVvYyggQJnYfaXupbHRAPVhUxr1vso":0.17543376,"DwzkF4gP1xRoFKA4YvJpNFR9qvYzaWrFaE":0.17543375}'
IdealCash and most other altcoins will use the exact same format. In this example you can hopefully see that two addresses are listed to send coins to, with an amount for each one and a comma between them. If you are manually creating a transaction to send coins to somebody else then you will probably need to do this, because you will probably need to select outputs which add up to more than the amount you want to send, and then send the change to yourself. For example, you may have outputs of 10 and 15, but you want to send 20 coins – so you need to send the remaining 5 to your own address, otherwise they will be added to the transaction fee and lost.
When you have pasted this command into the console and hit enter, if you have managed to get all the grammar right, then you will get a long list of random looking characters as a response. This is your ‘raw transaction hex code’. Copy this for use in the next step.
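The inputs-minus-fee arithmetic above is easy to get wrong by hand, so it is worth scripting. A sketch using `Decimal` (floats can silently lose fractions of a coin); the 0.01 fee is the one stated above, and the address names are placeholders:

```python
from decimal import Decimal

def build_outputs(input_amounts, dest_addr, send_amt, change_addr, fee=Decimal("0.01")):
    # Anything not claimed by an output becomes the fee, so the change
    # output must soak up the remainder explicitly.
    total = sum(input_amounts)
    change = total - send_amt - fee
    if change < 0:
        raise ValueError("inputs do not cover amount + fee")
    outputs = {dest_addr: send_amt}
    if change > 0:
        outputs[change_addr] = change
    return outputs

# Outputs of 10 and 15, sending 20: 4.99 must come back to you as change.
print(build_outputs([Decimal(10), Decimal(15)], "D_dest_address",
                    Decimal(20), "D_change_address"))
```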
Signing Your Transaction
You now need to sign your transaction with the private keys of all addresses used in the outputs you chose earlier. You can find your private key for any address using the following console command:
dumpprivkey <address>

Obviously, you should replace <address> with the public address you want to get the private key for. If you have encrypted your wallet then you will need to unlock it before doing this.
You can now sign your raw transaction using the following command:
signrawtransaction 'your hex code generated in the last step' '[output1, output2]' '["privatekeyone", "privatekeytwoifneeded"]'
The middle section with output1 and output2 should be exactly the same as you used in the last step when you created the raw transaction.
When you hit the enter button on that command you will get a new, signed hex code. Copy and paste that into the final command:
Sending Your Transaction
The last bit is easy:
sendrawtransaction 'your signed hex code'

When you hit enter on that final command your transaction will be broadcast to the network and your console will reply with a transaction id. If you have used this to send coins to yourself in order to consolidate smaller payments for staking, then your wallet GUI should now show ‘payment to yourself -0.01’.
submitted by Tyler__Z to IdealCash [link] [comments]

[ELI5] What IS a Dogecoin?

OK, this is something I wrote in response to several questions over on /BitcoinBeginners (which, BTW, is a significantly more active version of our own /Dogeducation and worth a look). Most recently in https://www.reddit.com/BitcoinBeginners/comments/5lw3w1/what_actually_is_a_bitcoin/
It is equally valid for other cryptos, including Dogecoin.
Lets cover some basics here:
So, what does this tell you?
  1. You MUST have your keys, and ONLY you should have them. Preferably human-readable.
  2. Keys can be stored anywhere. You could engrave them onto a pebble and hide it in the garden if you wished.
  3. Multiple copies are a no-brainer. Backup, backup, backup.
  4. If someone else gains your keys, they can spend your coins.
  5. Third party wallets which do NOT give you your keys are NOT your wallets!
  6. There are services which you can run, online or from a local copy, to generate wallets, create, sign and broadcast transactions, and track wallets.
  7. Given the above, you don't actually NEED clients or third-party wallet services.
  8. The cheapest wallet is just a line of text. For example, here's one I just generated:

1Px9Yj7uRJhpVPXmJ8ieQqT4vrP5Y5tz4S (public bitcoin address)

Kxej8kNXV9NBh7mCEMtp7xtFEEniUrzo5rsqHjXrTUn8ieyTj3SP (private key)

You could save that in any form you wish. Write it in a text file. Take a picture of it. Paint it on your wall. Chisel it into a brick. Engrave it inside a diamond. Or, load it into a software client or (expensive) hardware device which advertises what it is, and rely on easily-forgotten passphrases, 2FA, etc. Not to mention the many people who have accidentally deleted, reformatted or even thrown millions of dollars worth of coins out in the garbage.
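Keys like the pair above are just bytes run through Base58Check encoding (a version byte plus payload, with a 4-byte double-SHA256 checksum appended). A minimal sketch of that encoding — for understanding the format, not a substitute for a real wallet library:

```python
import hashlib

ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def b58check_encode(payload: bytes) -> str:
    # Append first 4 bytes of double-SHA256 as a checksum, then base58-encode.
    checksum = hashlib.sha256(hashlib.sha256(payload).digest()).digest()[:4]
    data = payload + checksum
    n = int.from_bytes(data, "big")
    out = ""
    while n:
        n, r = divmod(n, 58)
        out = ALPHABET[r] + out
    # Each leading zero byte is encoded as a leading '1'.
    pad = len(data) - len(data.lstrip(b"\x00"))
    return "1" * pad + out

def b58check_decode(s: str) -> bytes:
    n = 0
    for c in s:
        n = n * 58 + ALPHABET.index(c)
    data = n.to_bytes((n.bit_length() + 7) // 8, "big")
    pad = len(s) - len(s.lstrip("1"))
    data = b"\x00" * pad + data
    payload, checksum = data[:-4], data[-4:]
    if checksum != hashlib.sha256(hashlib.sha256(payload).digest()).digest()[:4]:
        raise ValueError("bad checksum - the key was mistyped or corrupted")
    return payload

# WIF-style demo: 0x80 version byte + 32-byte key (all zeros here, demo only).
print(b58check_encode(b"\x80" + bytes(32)))
```

The checksum is why a single mistyped character in a hand-copied key is detected rather than silently sending coins into the void.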
Remember, YOUR coins are YOUR responsibility. You wouldn't leave a bundle of $100 notes sitting in the street, don't do the same with your crypto. ;)
submitted by Fulvio55 to dogecoin [link] [comments]

Secure paper wallet tutorial

This is my handout for paranoid people who want a way to store bitcoin safely. It requires a little work, but this is the method I use because it should be resistant to risks associated with:
  1. Bad random number generators
  2. Malicious or flawed software
  3. Hacked computers
If you want a method that is less secure but easier, skip to the bottom of this post.
The Secure Method
  1. Download bitaddress.org. (Try going to the website and pressing "ctrl+s")
  2. Put the bitaddress.org file on a computer with an operating system that has not interacted with the internet much or at all. The computer should not be hooked up to the internet when you do this. You could put the bitaddress file on a USB stick, and then turn off your computer, unplug the internet, and boot it up using a boot-from-CD copy of linux (Ubuntu or Mint for example). This prevents any mal-ware you may have accumulated from running and capturing your keystrokes. I use an old android smart phone that I have done a factory reset on. It has no sim-card and does not have the password to my home wifi. Also the phone wifi is turned off. If you are using a fresh operating system, and do not have a connection to the internet, then your private key will probably not escape the computer.
  3. Roll a die 62 times and write down the sequence of numbers. This gives you 6^62 ≈ 2^160 possible outcomes, which is the maximum that Bitcoin supports.
  4. Run bitaddress.org from your offline computer. Input the sequence of numbers from the die rolls into the "Brain Wallet" tab. By providing your own source of randomness, you do not have to worry that the random number generator used by your computer is too weak. I'm looking at you, NSA ಠ_ಠ
  5. Brain Wallet tab creates a private key and address.
  6. Write down the address and private key by hand or print them on a dumb printer. (Dumb printer means not the one at your office with the hard drive. Maybe not the 4 in 1 printer that scans and faxes and makes waffles.) If you hand copy them you may want to hand copy more than one format. (WIF and HEX). If you are crazy and are storing your life savings in Bitcoin, and you hand copy the private key, do a double-check by typing the private key back into the tool on the "Wallet Details" tab and confirm that it recreates the same public address.
  7. Load your paper wallet by sending your bitcoin to the public address. You can do this as many times as you like.
  8. You can view the current balance of your paper wallet by typing the public address into the search box at blockchain.info
  9. If you are using an old cell phone or tablet do a factory reset when you are finished so that the memory of the private keys is destroyed. If you are using a computer with a boot-from-CD copy of linux, I think you can just power down the computer and the private keys will be gone. (Maybe someone can confirm for me that the private keys would not be able to be cached by bitaddress?)
  10. To spend your paper wallet, you will need to either create an offline transaction, or import the private key into a hot wallet. Creating an offline transaction is dangerous if you don't know what you are doing. Importing to a client side wallet like Bitcoin-Qt, Electrum, MultiBit or Armory is a good idea. You can also import to an online wallet such as Blockchain.info or Coinbase.
Trusting bitaddress.org
The only thing you need bitaddress.org to do is to honestly convert the brainwallet passphrase into the corresponding private key and address. You can verify that it is doing this honestly by running several test passphrases through the copy of bitaddress that you plan on using, and several other brainwallet generators. For example, you could use the online version of bitaddress, and brainwallet and safepaperwallet and bitcoinpaperwallet. If you are fancy with the linux command line, you can also try "echo -n my_die_rolls | sha256sum". The linux operating system should reply with the same private key that bitaddress makes. This protects you from a malicious paper wallet generator.
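The `sha256sum` cross-check above can also be reproduced without a shell. A minimal sketch — the die-roll string here is a made-up example, yours would be your own 62 digits:

```python
import hashlib

die_rolls = "3141592653589793"  # hypothetical example; use your own 62 rolls
candidate_key = hashlib.sha256(die_rolls.encode()).hexdigest()

# This hex digest should match `echo -n 3141592653589793 | sha256sum` and the
# private key (hex) that an honest brainwallet tool derives from the passphrase.
print(candidate_key)
```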
Trusting your copy of bitaddress.org
Bitaddress publishes the sha1 hash of the bitaddress.org website at this location:
https://www.bitaddress.org/pgpsignedmsg.txt
The message is signed by the creator, pointbiz. I found his PGP fingerprint here:
https://github.com/pointbiz/bitaddress.org/issues/18
"527B 5C82 B1F6 B2DB 72A0 ECBF 8749 7B91 6397 4F5A"
With this fingerprint, you can authenticate the signed message, which gives you the hash of the current bitaddress.org file. Then you can hash your copy of the file and authenticate the file.
I do not have a way to authenticate the fingerprint itself, sorry. According to the website I linked to, git has cryptographic traceability that would enable a person to do some research and authenticate the fingerprint. If you want to go that far, knock yourself out. I think that the techniques described in this document do not really rely on bitaddress being un-corrupt. Anyway, how do we know pointbiz is a good guy? ;-)
There are a lot of skilled eyes watching bitaddress.org and the signed sha1 hash. To gain the most benefit from all of those eyes, it's probably worthwhile to check your copy by hashing it and comparing to the published hash.
"But we aren't supposed to use brainwallets"
You are not supposed to use brainwallets that have predictable passphrases. People think they are pretty clever about how they pick their passphrases, but a lot of bitcoins have been stolen because people tend to come up with similar ideas. If you let dice generate the passphrase, then it is totally random, and you just need to make sure to roll enough times.
How to avoid spending your life rolling dice
When I first started doing this, I rolled a die 62 times for each private key. This is not necessary. You can simply roll the die 62 times and keep the sequence of 62 numbers as a "seed". The first paper address you create would use "my die rolls-1" as the passphrase, the second would be "my die rolls-2" and so on. This is safe because SHA256 prevents any computable relationship between the resulting private keys.
Of course this has a certain bad security scenario -- if anyone obtains the seed they can reconstruct all of your paper wallets. So this is not for everyone! On the other hand, it also means that if you happen to lose one of your paper wallets, you could reconstruct it so long as you still had the seed.
One way to reduce this risk is to add an easy to remember password like this: "my die rolls-password-1".
If you prefer, you can use a technique called diceware to convert your die rolls to words that still contain the same quantity of entropy, but which could be easier to work with. I don't use diceware because it's another piece of software that I have to trust, and I'm just copy/pasting my high entropy seed, so I don't care about how ugly it is.
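The seed-plus-counter scheme above looks like this in code (a sketch: the brainwallet tool does the passphrase-to-key step itself, the SHA256 call here just mirrors what it does; the seed and password are stand-ins):

```python
import hashlib

def wallet_passphrases(seed, count, password=None):
    # "my die rolls-1", "my die rolls-2", ... optionally with a password segment.
    for i in range(1, count + 1):
        if password:
            yield f"{seed}-{password}-{i}"
        else:
            yield f"{seed}-{i}"

seed = "my die rolls"  # stand-in for your 62-digit roll sequence
phrases = list(wallet_passphrases(seed, 3, password="hunter2"))
digests = {hashlib.sha256(p.encode()).hexdigest() for p in phrases}
print(phrases)       # ['my die rolls-hunter2-1', ...]
print(len(digests))  # 3 distinct keys from a single seed
```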
Why not input the dice as a Base 6 private key on the Wallet Details tab?
Two reasons. First of all, this option requires that you roll the die 99 times, but you do not get meaningful additional protection by rolling more than 62 times. Why roll more times if you don't have to? Second, I use the "high entropy seed" method to generate multiple private keys from the same die rolls. Using the Base 6 option would require rolling 99 times for every private key.
I'm a big nerd with exotic dice. How many times to roll?
Put this formula in Excel to get the number of times to roll: "=160*LOG(2,f)" where f = number of faces on the die. For example, you would roll a d16 40 times. By the way, somewhat unbelievably casino dice are more fair than ordinary dice
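The same formula outside Excel: each roll of a fair f-sided die carries log2(f) bits of entropy, and you want at least 160 bits in total.

```python
import math

def rolls_needed(faces, bits=160):
    # Smallest number of rolls of a fair `faces`-sided die giving `bits` of entropy.
    return math.ceil(bits / math.log2(faces))

print(rolls_needed(6))   # 62 - the ordinary die used throughout this guide
print(rolls_needed(16))  # 40 - the d16 example above
```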
The "Change address" problem:
You should understand change addresses because some people have accidentally lost money by not understanding it.
Imagine your paper wallet is a 10 dollar bill. You use it to buy a candy bar. To do this you give the cashier the entire 10 dollar bill. They keep 1 dollar and give you 9 dollars back as change.
With Bitcoin, you have to explicitly say that you want 9 dollars back, and you have to provide an address where it should go to. If you just hand over the 10 dollar bill, and don't say you want 9 dollars back, then the miner who processes the transaction gives 1 dollar to the store and keeps the remainder themselves.
Wallet software like Bitcoin-Qt handles this automatically for you. They automatically make "change addresses" and they automatically construct transactions that make the change go to the change address.
There are three ways I know of that the change problem can bite you:
  1. You generate a raw transaction by hand, and screw up. If you are generating a transaction "by hand" with a raw transaction editor, you need to be extra careful that your outputs add up to the same number as your inputs. Otherwise, the very lucky miner who puts your transaction in a block will keep the difference.
  2. You import a paper wallet into a wallet software and spend part of it, and then think that the change is in the paper wallet. The change is not in the paper wallet. It is in a change address that the wallet software generated. That means that if you lose your wallet.dat file you will lose all the change. The paper wallet is empty.
  3. You import a paper wallet into a wallet software and spend part of it, and then think that the change is in the change address that the wallet software generated. If the transaction did not need to consume all of the "outputs" used to fund the paper wallet, then there could be some unspent outputs still located at the address of the paper wallet. If you destroyed the paper wallet, and destroyed the copy of the private key imported to the wallet software, then you could not access this money. (E.g. if you restored the software wallet from its seed, thinking all of the money was moved to the wallet-generated change addresses.)
For more on this, see here
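Before broadcasting a hand-built transaction you can sanity-check the implicit fee, since whatever the outputs don't claim goes to the miner. A sketch of that check, using the 10-dollar-bill example from above (amounts as `Decimal` to avoid float rounding):

```python
from decimal import Decimal

def implicit_fee(input_amounts, output_amounts):
    # The fee is never stated explicitly: it is inputs minus outputs.
    fee = sum(input_amounts, Decimal(0)) - sum(output_amounts, Decimal(0))
    if fee < 0:
        raise ValueError("outputs exceed inputs: invalid transaction")
    return fee

# Spending a 10-coin paper wallet: 1 to the store, 9 back to yourself as change.
print(implicit_fee([Decimal(10)], [Decimal(1), Decimal(9)]))  # 0
# Forget the change output and the miner keeps 9 coins as the fee.
print(implicit_fee([Decimal(10)], [Decimal(1)]))              # 9
```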
The hot paper wallet problem
Your bitcoin in your paper wallet are secure, so long as the piece of paper is secure, until you go to spend it. When you spend it, you put the private key onto a computer that is connected to the internet. At this point you must regard your paper wallet address as hot because the computer you used may have been compromised. It now provides much less protection against theft of your coins. If you need the level of protection that a cold paper wallet provides, you need to create a new one and send your coins to it.
Destroying your paper wallet address
Do not destroy the only copy of a private key without verifying that there is no money at that address. Your client may have sent change to your paper wallet address without you realizing it. Your client may have not consumed all of the unspent outputs available at the paper wallet address. You can go to blockchain.info and type the public address into the search window to see the current balance. I don't bother destroying my used/empty paper wallet addresses. I just file them away.
Encrypting your private key
BIP 0038 describes a standardized way to encrypt your paper wallet private key. A normal paper wallet is vulnerable because if anyone sees the private key they can take the coins. The BIP38 protocol is even resistant to brute force attacks because it uses a memory intensive encryption algorithm called scrypt. If you want to encrypt your wallets using BIP38, I recommend that you use bitcoinpaperwallet because they will let you type in your own private key and will encrypt it for you. As with bitaddress, for high security you should only use a local copy of this website on a computer that will never get connected to the internet.
Splitting your private key
Another option for protecting the private key is to convert it into multiple fragments that must be brought together. This method allows you to store pieces of your key with separate people in separate locations. It can be set up so that you can reconstitute the private key when you have any 2 out of the 3 fragments. This technique is called Shamir's Secret Sharing. I have not tried this technique, but you may find it valuable. You could try using this website http://passguardian.com/ which will help you split up a key. As before, you should do this on an offline computer. Keep in mind if you use this service that you are trusting it to work properly. It would be good to find other independently created tools that could be used to validate the operation of passguardian. Personally, I would be nervous destroying the only copy of a private key and relying entirely on the fragments generated by the website.
Looks like Bitaddress has an implementation of Shamir's Secret Sharing now under the "Split Wallet" tab. However it would appear that you cannot provide your own key for this, so you would have to trust bitaddress.
Durable Media
Pay attention to the media you use to record your paper wallet. Some kinds of ink fade, some kinds of paper disintegrate. Moisture and heat are your enemies.
In addition to keeping copies of my paper wallet addresses I did the following:
  1. Order a set of numeric metal stamps. ($10)
  2. Buy a square galvanized steel outlet cover from the hardware store ($1)
  3. Buy a sledgehammer from the hardware store
  4. Write the die rolls on the steel plate using a sharpie
  5. Use the hammer to stamp the metal. Do all the 1's, then all the 2's etc. Please use eye protection, as metal stamp may emit sparks or fly unexpectedly across the garage. :-)
  6. Use nail polish remover to erase the sharpie
Electrum
If you trust electrum you might try running it on an offline computer, and having it generate a series of private keys from a seed. I don't have experience with this software, but it sounds like there are some slick possibilities there that could save you time if you are working with a lot of addresses.
Message to the downvoters
I would appreciate it if you would comment, so that I can learn from your opinion. Thanks!
The Easy Method
This method is probably suitable for small quantities of bitcoin. I would not trust it for life-altering sums of money.
  1. Download the bitaddress.org website to your hard drive.
  2. Close your browser
  3. Disconnect from the internet
  4. Open the bitaddress.org website from your hard drive.
  5. Print a paper wallet on your printer
  6. Close your browser
submitted by moral_agent to BitcoinWallet [link] [comments]
