Factors Affecting Cryptocurrency Mining Profit | EastShore

Where is Bitcoin Going and When?


The Federal Reserve and the United States government are pumping extreme amounts of money into the economy, already totaling over $484 billion. They are doing so because they have always had a goal of inflating the United States Dollar (USD) so that the market can continue to all-time highs. They do not care how much inflation rises now, as we are entering a depression with the potential to crash the US economy permanently. They believe the only way to save the market from going to zero or negative values is to inflate it so much that it cannot possibly crash that low. Even if the market does not dip that low, inflation serves the interests of powerful people.
The impending crash of the stock market has ramifications for Bitcoin: though there is no direct ongoing correlation between the two, major movements in traditional markets will necessarily affect Bitcoin. According to the Blockchain Center’s Cryptocurrency Correlation Tool, Bitcoin is not correlated with the stock market. However, when major market movements occur, they send ripples throughout the financial ecosystem which affect even ordinarily uncorrelated assets.
Therefore, Bitcoin will reach X price on X date after crashing to a price of X by X date.

Stock Market Crash

The Federal Reserve has caused serious consternation with its release of enormous amounts of money in an attempt to buoy the economy. At face value, this seems to have no rationale or logic behind it other than keeping the economy afloat long enough for individuals to profit financially and politically. However, there is an underlying basis to what is going on which is important to understand in order to profit financially.
All markets are functionally price-probing systems. They constantly undergo a price-discovery process. In a fiat system, money is illusory, a fundamentally synthetic instrument with no intrinsic value – similar to Bitcoin. The primary difference between Bitcoin and fiat is the underlying technology, which provides a slew of benefits that fiat does not. Fiat, however, has the advantage of the support of powerful nation-states, which can use their might to ensure the currency’s prosperity.
Traditional stock markets are composed of indices (pl. of index). Indices are non-trading market instruments which are essentially summaries of the values of the businesses that comprise them. They are continuously recalculated throughout a trading day and are sometimes reflected through tradable instruments such as Exchange Traded Funds or futures. Indices are weighted by the market capitalizations of their constituent businesses.
Price theory essentially states that when a market fails to take out a new low in a given range, it acquires an objective to take out the high. When a market fails to take out a new high, it acquires an objective to make a new low. This is why price-time charts go up and down on a second-by-second, minute-by-minute, day-by-day, and even century-by-century basis. Therefore, market indices will always return to some type of bull market: once a true low is formed, the market has a price objective to take out a new high outside of its given range – an all-time high. Instruments can only fall to zero, whereas they can grow infinitely.
So, why inflate the economy so much?
Deflation is disastrous for central banks and markets, as it raises the possibility of an overall price objective of zero or negative values. Therefore, under a fractional-reserve system with a fiat currency managed by a central bank, the goal of the central bank is to depreciate the currency. The dollar is manipulated constantly with the intention of depreciating its value.
Central banks have a goal of continually inflated fiat values. They ordinarily contain inflation at less than ten percent (10%) per annum so that the psyche of the general populace can slowly adjust to price increases. As such, the markets are divorced from any other logic. Economic policy is the maintenance of human egos, not catering to fundamental analysis. Gross Domestic Product (GDP) growth is well known not to be a measure of actual growth or output; it is a measure of the increase in dollars processed. Banks seek to produce rising numbers which make society feel like it is growing economically, making people optimistic. To do so, the currency is inflated, though inflation itself does not actually increase growth. When society is optimistic, it spends and engages in business – resulting in actual growth. It also encourages people to take on credit and debt, creating more fictional fiat.
Inflation is necessary for markets to continue to reach new heights, generating positive emotional responses from the populace, encouraging spending, encouraging debt intake, further inflating the currency, and increasing the sale of government bonds. The fiat system only survives by generating more imaginary money on a regular basis.
Bitcoin investors may profit from this by realizing that stock investors as a whole always stand to profit from the market so long as it is managed by a central bank and does not collapse entirely. If those conditions are met, it has an unending price objective to rise to new heights. This response also indicates that the higher-ups believe the economy could crash in its entirety, so it may be wise for investors to have multiple well-thought-out exit strategies.

Economic Analysis of Bitcoin

The reason the Fed is so aggressively inflating the economy is fear that it will collapse permanently or never rebound. As such, coupled with a global depression, a huge demand will appear for a reserve currency which is fundamentally different from the previous system. Bitcoin, though a currency or asset, is also a market. It too undergoes a constant price-probing process. Unlike traditional markets, Bitcoin has the exact opposite goal: it seeks to appreciate in value, not depreciate. This has a quite different effect in that Bitcoin could potentially become worthless and have a price objective of zero.
Bitcoin was created in 2008 by a now-famous mysterious figure known as Satoshi Nakamoto, and its open-source code was released in 2009. It was the first decentralized cryptocurrency to utilize a novel protocol known as the blockchain. Each block may carry up to one megabyte of transaction data. Bitcoin is decentralized, pseudonymous, transparent, easy to set up, and provides myriad other benefits. It is not backed by anything other than its own technology.
Bitcoin can never be expected to collapse as a framework, even were it to become worthless. The stock market has the potential to collapse entirely, whereas, as long as the internet exists, Bitcoin will remain a functional system with a self-authenticating framework. That capacity to persist regardless of Bitcoin’s actual price, together with its deflationary nature, gives it something which fiat does not – inherent value.
Bitcoin is based on a distributed database known as the “blockchain.” Blockchains are essentially decentralized virtual ledger books, replete with pages known as “blocks.” Each page in a ledger is composed of paragraph entries, which are the actual transactions in the block.
Blockchains store information in the form of numerical transactions, which are just numbers. We can consider these numbers digital assets, such as Bitcoin. The data in a blockchain is immutable and recorded only through consensus-based algorithms. Bitcoin is cryptographic, and all transactions are direct, peer-to-peer, without intermediaries.
Bitcoin does not require trust in a central bank. It requires trust in the technology behind it, which is open source and may be evaluated by anyone at any time. Furthermore, it is extremely difficult to manipulate, as doing so would require compromising a majority of the network’s nodes at once – unlike the stock market, which is manipulated by the government and “Market Makers”. Bitcoin is also private in that, though the ledger is openly distributed, the identities behind addresses are not recorded on it. Bitcoin’s blockchain has one of the greatest redundancy and information disaster-recovery systems ever developed.
Bitcoin has a distributed governance model in that it is controlled by its users. There is no need to trust a payment processor or bank, or to pay fees to such entities – there are no third-party fees for transaction processing. As the ledger is immutable and transparent, it is never possible to change it: the data on the blockchain is permanent. The system is not easily susceptible to attacks, as it is widely distributed. Furthermore, as users of Bitcoin sign their transactions with their private keys, transactions are virtually impossible to fake. No lengthy verification, reconciliation, or clearing process exists with Bitcoin.
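The ledger-of-pages analogy above can be sketched in a few lines of Python. This is a toy illustration of hash-chaining only, not Bitcoin's actual data structures (real blocks carry headers, Merkle roots, timestamps, and more):

```python
import hashlib
import json

def hash_block(block: dict) -> str:
    """SHA-256 over the block's canonical JSON serialization."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(transactions: list, prev_hash: str) -> dict:
    """A 'page' in the ledger: entries plus a link to the previous page."""
    return {"transactions": transactions, "prev_hash": prev_hash}

# Build a three-block chain; tampering with any earlier block changes
# its hash and breaks every link after it.
genesis = make_block(["alice pays bob 1 BTC"], prev_hash="0" * 64)
block2 = make_block(["bob pays carol 0.5 BTC"], prev_hash=hash_block(genesis))
block3 = make_block(["carol pays dave 0.2 BTC"], prev_hash=hash_block(block2))

assert block3["prev_hash"] == hash_block(block2)   # chain intact
genesis["transactions"][0] = "alice pays bob 100 BTC"  # tamper with history
assert block2["prev_hash"] != hash_block(genesis)  # tampering is detectable
```

This is why the ledger is effectively immutable: rewriting one "page" would require recomputing and re-agreeing on every page after it.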
Bitcoin is based on a proof-of-work algorithm. Each new block on the network has an associated mathematical “puzzle”: computers known as miners compete to find a nonce (a number used once) such that the block’s cryptographic hash falls below a network-defined target. Finding the solution is proof that the miner performed sufficient work. Only one block is being sought at a time, and solving it currently issues a reward of 12.5 Bitcoin. Once it is solved, the solution is broadcast to the network.
A block is mined on average once every ten minutes. Every 2,016 blocks (approximately two weeks), the network checks how long those blocks actually took to mine. If they were mined faster than expected, the difficulty increases; if slower, it decreases – keeping the average block time near ten minutes. Separately, every 210,000 blocks (approximately four years), the block reward is halved, deflating Bitcoin’s issuance. This will continue until no new Bitcoin are issued, projected around the year 2140. On the twelfth of May, 2020, the blockchain will halve the amount of Bitcoin issued per block. When Bitcoin was first created, fifty BTC were issued per block as a reward to miners; from this halving onward, the reward will be 6.25 BTC per block.
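The halving schedule described above is simple enough to compute directly. A minimal sketch (real Bitcoin counts rewards in integer satoshis and rounds down, which is why the actual cap lands slightly under 21 million):

```python
def block_subsidy(height: int) -> float:
    """Block reward in BTC at a given block height.
    The reward starts at 50 BTC and halves every 210,000 blocks."""
    halvings = height // 210_000
    if halvings >= 64:
        return 0.0
    return 50.0 / (2 ** halvings)

def total_supply_cap() -> float:
    """Sum issuance across all halving eras (approaches 21 million)."""
    return sum(block_subsidy(era * 210_000) * 210_000 for era in range(64))

print(block_subsidy(0))          # 50.0  (2009 launch)
print(block_subsidy(629_999))    # 12.5  (just before the May 2020 halving)
print(block_subsidy(630_000))    # 6.25  (just after)
print(round(total_supply_cap())) # 21000000
```

The geometric series 50 + 25 + 12.5 + … per 210,000-block era is what pins total supply at the 21 million cap discussed below.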
Unlike fiat, Bitcoin is a deflationary currency. As BTC becomes scarcer, demand for it will increase, raising the price. In this, BTC is similar to gold. Unlike the USD, its output is predictable, as it is based on a programmed supply. We can predict BTC’s issuance almost exactly, if not exactly. Only 21 million BTC will ever be produced, unless the entire network agrees to change the protocol – which is highly unlikely.
Some of the drawbacks to BTC include congestion. At peak congestion, it may take an entire day to process a Bitcoin transaction, as only three to five transactions may be processed per second. Receiving priority on a payment may cost up to the equivalent of twenty dollars ($20). A single Bitcoin transaction consumes roughly as much energy as a single-family home uses in a week.

Trading or Investing?

The fundamental divide in trading revolves around the question of market structure. Many feel that the market operates totally randomly and its behavior cannot be predicted. For the purposes of this article, we will assume that the market has a structure, but that that structure is not perfect. That market structure naturally generates chart patterns as the market records prices in time. In order to determine when the stock market will crash, causing a major decline in BTC price, we will analyze an instrument which represents an index – an exchange traded fund – as opposed to a particular stock. The price patterns of the various stocks in an index are effectively smoothed out, so a more technical picture arises. Perhaps the most popular of these is the SPDR S&P 500 Exchange Traded Fund ($SPY).
In trading, little to no concern is given to the value of the underlying asset. We are concerned primarily with liquidity and trading ranges – the amount of value fluctuating on a short-term basis, as measured by volatility-implied trading ranges. Fundamental analysis plays a role; however, markets often do not react to real-world factors in a logical fashion. Therefore, fundamental analysis is more appropriate for long-term investing.
The fundamental dimensions of a chart are time (x-axis) and price (y-axis). The primary technical indicator is price itself, as everything else lags behind it. Price represents the current asking price, and incorrectly entering positions based on price alone is one of the biggest trading errors.
Markets and currencies ordinarily have noise – a tendency to back-and-fill – which must be filtered out for true pattern recognition. That noise does have a utility, however, in allowing traders second chances to enter favorable positions at slightly less favorable entry points. When any market has enough liquidity for historical data to record a pattern, a structure can be divined. The market probes prices as part of an ongoing price-discovery process. Market technicians must sometimes look outside of the technical realm and use visual inspection to ascertain the relevance of certain patterns, applying a qualitative eye that recognizes the underlying quantitative nature.
Markets and instruments rise more slowly than they correct, yet over time they rise much more than they fall. In the same vein, instruments can only fall to zero, whereas they can theoretically grow infinitely – and have continued to grow over time. Money in a fiat system is illusory, a fundamentally synthetic instrument with no intrinsic value. Hence the recent seemingly illogical fluctuations in the market.
According to trade theory, the unending purpose of a market or instrument is to create and break price ranges according to the laws of supply and demand. We must determine when to trade based on each market inflection point, as defined in price and in time, as opposed to abandoning the trend (as the contrarian traders in this sub often do). Time and price symmetry must be used to trade in accordance with the trend. When coupled with a favorable risk-to-reward ratio, the ability to stay in the market for most of the defined time period, and adherence to risk-management rules, the trader has a solid methodology for achieving considerable gains.
We will engage in a longer-term, market-oriented analysis to avoid any time-focused pressure. The Bitcoin market is open twenty-four hours a day, so trading may be done whenever the individual is ready, without any pressing need to be constantly alert. Moreover, we can safely project months in advance with relatively high accuracy. Bitcoin is an asset which an individual can both trade and invest in; however, this article will focus on trading due to the wide volatility in BTC prices over the short term.

Technical Indicator Analysis of Bitcoin

Technical indicators are often considered self-fulfilling prophecies, due to mass-market psychology gravitating towards certain common numbers yielded by them. They are also often discounted when it comes to BTC. That means a trader must be especially aware of these numbers, as they can prognosticate market movements – even though, in the larger picture, they are often meaningless.
  • Volume – derived from the market itself, it is mostly irrelevant. The major problem with volume for stocks is that the US market open causes tremendous volume surges, eradicating any intrinsic volume analysis. This does not occur with BTC, as it trades twenty-four-seven. At major highs and lows, the market is typically anemic; most traders are not active at terminal discretes (peaks and troughs) because of levels of fear. Volume gives us confidence in time-and-price-symmetry market inflection points if we observe low volume at a foretold range of values. We can rationalize that an absolute discrete is usually discovered and anticipated by very few traders; as the general market realizes it, a herd mentality pushes the market in the direction favorable to defending it. Volume is also useful for swing trading, as the chances of a swing’s validity increase if a rise in volume is seen on and after the swing’s activation. Volume is steadily decreasing, and lows and highs are reached when volume is lower.
Therefore, due to the relatively high volume on the 12th of March, we can safely determine that a low for BTC was not reached.
  • VIX (Volatility Index) – this technical indicator gauges the level of fear by the amount of options-based “insurance” in portfolios. A low VIX environment, less than 20 for the S&P index, indicates a stable market with a possible uptrend. A high VIX, over 20, indicates a possible downtrend. VIX is of limited direct use for BTC, as no comparable, widely followed BTC volatility index exists. However, it allows us to predict the market low for $SPY, which will have an indirect impact on BTC in the short term, likely leading to the yearly low. It is equally important to see how VIX is changing over time – whether it is decreasing or increasing – as that indicates decreasing or increasing fear. Low volatility allows high leverage without undue risk. Occasionally, markets do rise with high VIX.
As VIX is unusually high, in the forties, we can be confident that a downtrend for the S&P 500 is imminent.
  • RSI (Relative Strength Index) – the most important technical indicator, useful for determining highs and lows when time symmetry is not availing itself. Analysis of RSI can sometimes conflict across time frames; the easiest way to use it is at its extremes – either under 30 (oversold) or over 70 (overbought). Extremes can be used for filtering highs or lows based on time-and-price-window calculations. It is highly instructive as to major corrective clues and indicative of continued directional movement. One must determine whether longer-term RSI values find support at the same values as before. It is currently at 73.56.
  • Secondly, RSI may be used as a high or low filter, by observing the level that short-term RSI reaches in counter-trend corrections. Repetitions of market movements in RSI determine how long a trade should be held. Once a short-term RSI reaches an extreme and stays there, the other RSIs should gradually reach the same extremes. Once all RSIs are at extreme highs, a trend confirmation should occur and the RSIs should drop back to their midpoints.
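For readers who want to reproduce RSI readings like the 73.56 cited above, Wilder's original formulation can be sketched as follows. This is a generic implementation, not the exact code of any charting tool, and different tools seed the averages slightly differently:

```python
def rsi(closes, period=14):
    """Wilder's Relative Strength Index over a list of closing prices.
    Returns the RSI (0-100) for the final bar."""
    if len(closes) < period + 1:
        raise ValueError("need at least period+1 closes")
    gains, losses = [], []
    for prev, curr in zip(closes, closes[1:]):
        change = curr - prev
        gains.append(max(change, 0.0))
        losses.append(max(-change, 0.0))
    # Seed with simple averages, then apply Wilder's smoothing.
    avg_gain = sum(gains[:period]) / period
    avg_loss = sum(losses[:period]) / period
    for g, l in zip(gains[period:], losses[period:]):
        avg_gain = (avg_gain * (period - 1) + g) / period
        avg_loss = (avg_loss * (period - 1) + l) / period
    if avg_loss == 0:
        return 100.0
    rs = avg_gain / avg_loss
    return 100.0 - 100.0 / (1.0 + rs)

# A steadily rising series pins RSI at the overbought extreme.
print(rsi([100 + i for i in range(20)]))  # 100.0
```

The under-30 / over-70 extremes discussed above apply directly to the returned value.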

Trend Definition Analysis of Bitcoin

Trend definition is highly powerful; its importance cannot be overstated. Knowledge of trend logic alone is enough to be a profitable trader, yet defining a trend is an arduous process. Multiple trends coexist across multiple time frames and across multiple market sectors. Like time structure, trend makes the underlying price of the instrument irrelevant. Trend definitions cannot determine the validity of newly formed discretes. A trend becomes apparent when trades based on counter-trend inflection points continue to fail.
Downtrends are defined as an instrument making lower lows and lower highs that are recurrent, additive, qualified swing setups. Downtrends are similar for all instruments except forex; they are fast and complete much more quickly than uptrends. An average downtrend lasts 18 months, something we will return to. An uptrend’s inception occurs when an instrument reaches a point where it fails to make a new low; that low will then be tested. After that, the instrument will either have a deep-range retracement or it may take out the low slightly, resulting in a double bottom. A swing must eventually form.
A simple way to roughly determine trend is to attempt to draw a line from three tops going upwards (uptrend) or a line from three bottoms going downwards (downtrend). It is not possible to correctly draw a downtrend line on the BTC chart, but it is possible to correctly draw an uptrend – indicating that the overall trend is downwards. The only mitigating factor is the impending stock market crash.
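The higher-lows / lower-highs logic behind that trendline test can be approximated programmatically. A rough sketch, using hypothetical pivot values rather than actual BTC prices:

```python
def classify_trend(pivots: list) -> str:
    """Crude trend check in the spirit described above: an uptrend makes
    higher lows, a downtrend makes lower highs. `pivots` alternates
    low, high, low, high, ... with at least three of each."""
    lows, highs = pivots[0::2], pivots[1::2]
    higher_lows = all(a < b for a, b in zip(lows, lows[1:]))
    lower_highs = all(a > b for a, b in zip(highs, highs[1:]))
    if higher_lows and not lower_highs:
        return "uptrend"
    if lower_highs and not higher_lows:
        return "downtrend"
    return "range"

# Hypothetical pivot sequences, not actual BTC data:
print(classify_trend([3800, 7100, 4900, 9100, 6500, 10300]))  # uptrend
print(classify_trend([10300, 9100, 8600, 7900, 6400, 5200]))  # downtrend
```

Real trendline drawing also weighs how far apart the pivots are in time, which this sketch ignores.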

Time Symmetry Analysis of Bitcoin

Time is the movement from the past through the present into the future. It is a measurement in quantified intervals. In many ways, our perception of it is a human construct. It is more powerful than price as time may be utilized for a trade regardless of the market inflection point’s price. Were it possible to perfectly understand time, price would be totally irrelevant due to the predictive certainty time affords. Time structure is easier to learn than price, but much more difficult to apply with any accuracy. It is the hardest aspect of trading to learn, but also the most rewarding.
Humans do not have the ability to recognize every time window; however, the ability to define market inflection points in terms of time is the single most powerful trading edge. Regardless, price should not be abandoned for time alone. Time-structure analysis is inherently flawed; as a fail-safe, the markets have price structure. Even though time is much more powerful, price structure should never be completely ignored. Time is the qualifier for price and vice versa. Time can fail by tricking traders into counter-trend trading.
Time is a predestined trade quantifier, a filter to slow trades down, as it allows a trader to specifically focus on specific time windows and rest at others. It allows for quantitative measurements to reach deterministic values and is the primary qualifier for trends. Time structure should be utilized before price structure, and it is the primary trade criterion which requires support from price. We can see price structure on a chart, as areas of mathematical support or resistance, but we cannot see time structure.
Time may be used to tell us an exact point in the future where the market will inflect, after Price Theory has been fulfilled. In the present, price objectives based on price theory added to possible future times for market inflection points give us the exact time of market inflection points and price.
Time structure consists of repetitions or inherent cycles of time, occurring in a methodical way to provide time windows which may be utilized for inflection points. They are not easily recognized or easily defined on a price chart, as measuring and observing time is very exact. Time structure is not a science, yet it does require precise measurements. Nothing is certain or definite. The critical question must be whether a particular approach to time structure is currently lucrative or not.
We will measure it in intervals of 180 bars. Our goal is to determine time windows – when the market will react and when we should pay the most attention. By using time repetitions – the fact that market inflection points occurred at some point in the past and should, therefore, reoccur at some point in the future – we gain confidence as to when SPY will reach a market inflection point. Time repetitions are essentially the market’s memory. However, simply measuring the time between two points and trying to extrapolate into the future does not work; measuring time is not the same as defining time repetitions. We will evaluate past sessions for market inflection points, whether discretes, qualified swings, or intra-range, then record the times at which the market has made highs or lows in a time period comparable to the one we seek to trade in.
What follows is a time histogram – a grouping of times which appear close together, segregated based on that closeness. Time is aligned into a combined histogram of repetitions and cycles; however, cycles are irrelevant on a daily basis. If trading on an hourly basis, do not use hours.
  • Yearly Lows (last seven years): 1/1/13, 4/10/14, 1/15/15, 1/17/16, 1/1/17, 12/15/18, 2/6/19
  • Monthly Mode: 1, 1, 1, 1, 2, 4, 12
  • Daily Mode: 1, 1, 6, 10, 15, 15, 17
  • Monthly Lows (for the last year): 3/12/20 (10:00pm), 2/28/20 (7:09am), 1/2/20 (8:09pm), 12/18/19 (8:00am), 11/25/19 (1:00am), 10/24/19 (2:59am), 9/30/19 (2:59am), 8/29/19 (4:00am), 7/17/19 (7:59am), 6/4/19 (5:59pm), 5/1/19 (12:00am), 4/1/19 (12:00am)
  • Daily Lows Mode for those Months: 1, 1, 2, 4, 12, 17, 18, 24, 25, 28, 29, 30
  • Hourly Lows Mode for those Months (Military time): 0100, 0200, 0200, 0400, 0700, 0700, 0800, 1200, 1200, 1700, 2000, 2200
  • Minute Lows Mode for those Months: 00, 00, 00, 00, 00, 00, 09, 09, 59, 59, 59, 59
  • Day of the Week Lows (last twenty-six weeks):
Weighted times are repetitions which appear multiple times within the same list, observed and accentuated once divided into relevant sections of the histogram. They are important in the presently defined trading time period and are similar to a mathematical mode with respect to a series. Phased times are essentially periodic patterns in histograms, though they do not guarantee inflection points.
Evaluating the yearly lows, we see that BTC tends to have its lows primarily at the beginning of every year, with a possibility of it being at the end of the year. Following the same methodology, we get the middle of the month as the likeliest day. However, evaluating the monthly lows for the past year, the beginning and end of the month are more likely for lows.
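That beginning-versus-end-of-month conclusion can be reproduced from the monthly-lows list above. A sketch, where the "beginning/middle/end" cutoffs are one reasonable choice rather than a standard:

```python
from collections import Counter
from datetime import datetime

# Monthly lows for the past year, as listed above (dates only).
monthly_lows = [
    "3/12/20", "2/28/20", "1/2/20", "12/18/19", "11/25/19", "10/24/19",
    "9/30/19", "8/29/19", "7/17/19", "6/4/19", "5/1/19", "4/1/19",
]
dates = [datetime.strptime(d, "%m/%d/%y") for d in monthly_lows]

def bucket(day: int) -> str:
    """Crude histogram bins: beginning (1-10), middle (11-20), end (21-31)."""
    if day <= 10:
        return "beginning"
    if day <= 20:
        return "middle"
    return "end"

histogram = Counter(bucket(d.day) for d in dates)
print(histogram.most_common())  # [('end', 5), ('beginning', 4), ('middle', 3)]
```

The end and beginning of the month dominate, which is exactly the weighting the histogram analysis above relies on.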
Therefore, we have three primary dates from our histogram.
1/1/21, 1/15/21, and 1/29/21
2:00am, 8:00am, 12:00pm, or 10:00pm
In fact, the high for this year was February the 14th, only thirty days off from our histogram calculations.
The 8.6-Year Armstrong-Princeton Global Economic Confidence model states that 2.15 year intervals occur between corrections, relevant highs and lows. 2.15 years from the all-time peak discrete is February 9, 2020 – a reasonably accurate depiction of the low for this year (which was on 3/12/20). (Taking only the Armstrong model into account, the next high should be Saturday, April 23, 2022). Therefore, the Armstrong model indicates that we have actually bottomed out for the year!
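The 2.15-year interval arithmetic can be checked with simple date math. A sketch assuming the December 2017 BTC peak as the anchor and 365.25-day years; the model's own dating conventions may differ, which is why the results land within days of (not exactly on) the dates quoted above:

```python
from datetime import date, timedelta

QUARTER_WAVE_YEARS = 8.6 / 4  # the model's 2.15-year interval between turns

def next_turn(from_date: date, intervals: int = 1) -> date:
    """Project a turning point a given number of 2.15-year intervals ahead.
    Uses 365.25-day years, an assumption about the model's day count."""
    days = round(QUARTER_WAVE_YEARS * 365.25 * intervals)
    return from_date + timedelta(days=days)

btc_all_time_high = date(2017, 12, 17)   # assumption: the Dec 2017 peak
print(next_turn(btc_all_time_high))      # 2020-02-10, near the Feb 9 date above
print(next_turn(btc_all_time_high, 2))   # 2022-04-06, vs. the April 23 date above
```

The one-day and few-week offsets show how sensitive these projections are to the anchor date and day-count convention chosen.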
Bear markets cannot exist in perpetuity, whereas bull markets can: bear markets would eventually reach price objectives of zero, whereas bull markets can increase to infinity. A perpetual bear market can occur for individual market instruments, but not for markets as a whole. Since bull markets are defined by low volatility, they also last longer. Once a bull market is indicated, the trader can remain in a long position until a new high is reached, then switch to shorts. The average bear market is eighteen months long, giving us a date of August 19th, 2021 for the end of this bear market – roughly speaking. Bear markets cannot be shorter than fifteen months in a central-bank-controlled market, which does not apply to Bitcoin. (Otherwise, it would continue until Sunday, September 12, 2021.) However, we should expect Bitcoin to experience its exponential growth after the stock market re-enters a bull market.
Terry Laundry’s T-Theory is implemented by measuring the time of an indicator from peak to trough, then using that duration to define a future time window. It is similar to a head-and-shoulders pattern in that it is the process of forming the right side from a synthetic technical indicator. If the indicator keeps making new lows, the time is recalculated for defining the right side of the T. The date of the market inflection point may be a price or an indicator inflection date, so it is not always exact; it is better used to make us aware of possible market inflection points, clustered with other data. It gives us an RSI low of May 9th, 2020.
The Bradley Cycle, coupled with volatility, suggests start dates for campaigns or for put options as portfolio insurance for stocks. However, it is also useful for predicting market moves rather than terminal dates for discretes. Using dates which correspond to discretes, we can see how those dates correspond with changes in VIX.
Therefore, our timeline looks like:
  • 2/14/20 – yearly high ($10372 USD)
  • 3/12/20 – yearly low thus far ($3858 USD)
  • 5/9/20 – T-Theory true yearly low (BTC between 4863 and 3569)
  • 5/26/20 – first difficulty adjustment after the halving
  • 11/14/20 – stock market low
  • 1/15/21 – yearly low for BTC, around $8528
  • 8/19/21 – end of stock bear market
  • 11/26/21 – eighteen months from the halving, the average time to a post-halving peak (BTC begins rising from the $3,000 area to above $23,312)
  • 4/23/22 – all-time high
Taken from my blog: http://aliamin.info/2020/

Staking in Ethereum 2.0: when will it appear and how much can you earn on it?


Why coin staking will be added in Ethereum 2.0

A brief educational program for those who have not followed the updates to Vitalik Buterin's project. Ethereum has long been in need of updating, and the main problem of the network is scalability: the blockchain is overloaded, transactions are slowing down, and the cost of “gas” (transaction fees) is growing. If the consensus algorithm is not updated, the network will someday cease to be operational. To avoid this, developers have been working for several years on moving the network from the PoW algorithm to Ethereum 2.0, running on PoS. This should make the network more scalable, faster, and cheaper.

In December last year, the first upgrade phase, Istanbul, was implemented in the network, and in April of this year the Topaz test network with the possibility of staking was launched – the first users have already earned 1%.

In the PoS algorithm that Ethereum is switching to, there is no mining; validation occurs through the delegation of users' coins to validator nodes. For the duration of the delegation, these coins are frozen, and for providing their funds for block validation, users receive a portion of the reward. This is staking – a crypto-analogue of a bank deposit. There are several types of staking (dividend income or masternodes), but in all of them what matters is not the device's power, as in PoW algorithms, but the number of coins held: the more coins, the higher the income. For crypto investors, staking is an opportunity to receive passive income from blocked coins. It is assumed that the launch of staking:
  • Will make earning ETH more accessible and less resource-intensive;
  • Will make the network safer and more secure – attacks will become too expensive;
  • Will create an entirely new staking-infrastructure sector around the platform;
  • Will provide increased scalability, creating the opportunity for wider implementation of DeFi protocols;
  • And, most importantly, it will show that Ethereum is a developing project.

The first payments to stakers will come one to two years after the launch of the update

The minimum validator stake will be 32 ETH (≈$6,092 today). This is the minimum number of coins that an ETH holder must freeze in order to qualify for payments. Another prerequisite is not to disconnect the wallet from the network: if a user disconnects and goes into automatic mode, he loses his daily income, and if at some point the stake drops below 16 ETH, the user will be deprived of the right to be a validator.

The Ethereum network has to go through many more important stages before coin holders can make money on its storage. Collin Myers, the product strategy lead at the Ethereum development startup ConsenSys, said that the genesis block of the new network will not be mined until the total amount of frozen funds reaches 524,000 ETH ($99.76 million at the time of publication). That many coins would need to be held by 16,375 validators with a minimum deposit of 32 ETH. Until that moment, none of them will receive any percentage profit.

Myers noted that this event is not tied to a specific time and depends on the activity of the community. All validators will have to freeze a rather significant amount for an indefinite period in the new network, without confidence in the growth of the coin's price. It is hard to say how many such people there are; the developers believe it will take 12-18 or even 24 months. According to the latest ConsenSys Codefi report, more than 65% of the 300 ETH owners surveyed plan to use the staking opportunity. This sample, of course, is not representative, but it suggests that most major coin holders will still be willing to take the chance.
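The genesis-threshold figures quoted above are internally consistent, as a quick check shows (the ETH price here is back-calculated from the $6,092-per-validator figure, an assumption):

```python
MIN_STAKE_ETH = 32
VALIDATORS_AT_GENESIS = 16_375
ETH_PRICE_USD = 190.38  # assumption: price implied by the quoted $6,092 minimum

# Total ETH that must be locked before the genesis block can be produced.
total_locked_eth = VALIDATORS_AT_GENESIS * MIN_STAKE_ETH
print(total_locked_eth)  # 524000

# Dollar value of that threshold at the assumed price.
print(f"${total_locked_eth * ETH_PRICE_USD:,.0f}")  # $99,759,120 (~$99.76M)
```

So 16,375 validators at 32 ETH each is exactly the 524,000 ETH (~$99.76 million) threshold cited.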

How much can you earn on Ethereum staking

Developers argued for a long time about what validator returns on the Ethereum 2.0 network should be. The network's economic model keeps inflation below 1% and dynamically adjusts the reward scale for validators. The difficulty is not to overpay, but also not to pay too little.

Returns will be variable, since they depend on the number and size of stakes, as well as other parameters: the fewer frozen coins and validators, the higher the yield, and vice versa. This is a simple way to motivate users to freeze ETH. According to Collin Myers's October calculations, after the launch of Ethereum 2.0 validators will be able to receive from 4.6% to 10.3% per annum as a reward for their stake. At the summit he clarified that right after the launch of the genesis block it could even reach 20.3%, but as the number of stakes grows, returns will decline - with five million ETH staked, yield drops to about 6.6%.

The numbers above are not net returns: they do not include equipment and electricity costs. According to Myers, right after the genesis block the cost of maintaining a validator node will be about 4.75% of the reward, and it will keep rising as more coins are locked - with five million staked, costs grow to about 14.7%. Myers emphasized that returns will be higher for those who run their own equipment rather than relying on cloud services; the latter, by his calculations, can at current prices produce a loss of up to minus 15% per year. This, he believes, promotes true decentralization.

At the end of April, Vitalik Buterin said that validators will be able to earn 5% per annum on the minimum stake of 32 ETH - 1.6 ETH per year, or $304 at the time of publication. However, given the cost of freezing funds, the real return will be around 0.8%.
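The inverse relationship Myers describes can be captured with a simple model: annual yield falls roughly as one over the square root of the total ETH staked. The sketch below calibrates the constant to the article's own figures (20.3% at the 524,000 ETH genesis threshold), purely as an illustration - the real Ethereum 2.0 reward schedule is more involved:

```python
import math

def approx_apr(total_staked_eth: float) -> float:
    """Approximate validator APR (%) as inversely proportional to the
    square root of total ETH staked. The constant is calibrated to the
    article's figures (20.3% at the 524,000 ETH genesis threshold)."""
    k = 20.3 * math.sqrt(524_000)
    return k / math.sqrt(total_staked_eth)

print(round(approx_apr(524_000), 1))    # 20.3 at the genesis threshold
print(round(approx_apr(5_000_000), 1))  # 6.6 with five million ETH staked
```

Note that this reproduces both of the article's data points, which suggests the quoted numbers were derived from a model of this shape.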

How to calculate profitability from ETH staking

The easiest way to estimate returns from Ethereum staking is to use a dedicated calculator, for example the online services EthereumPrice or Stakingrewards. These services take into account the network's latest yield indicators as well as additional parameters: node uptime, the coin's price, the share of ETH locked, and so on. Depending on these values, a validator's profit can vary greatly. For example, suppose you lock 32 ETH at today's coin price of $190, 1% of all coins are locked, and your node is online 99% of the time. According to the EthereumPrice calculator, your yield in this case will be 14.25% per annum, or 4.56 ETH.
Validator earnings from the example above for 10 years according to EthereumPrice.
If we change the inputs so that you have the same stake but the share of locked coins is 10%, your annual yield is only 4.51%, or 1.44 ETH.
Validator earnings from the second example over 10 years according to EthereumPrice.
It is important to note that this is yield before expenses. Real returns will be significantly lower, and in the second case may be negative. In addition, you must account for exchange-rate fluctuations: even with a 14% annual yield in ETH, dollar-denominated returns may be negative in a bear market.
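The bear-market caveat is easy to check numerically. A sketch, taking node costs as 4.75% of rewards from Myers's figure above; the price path is a made-up assumption:

```python
def usd_return(stake_eth, apr_eth, price_start, price_end, cost_frac=0.0475):
    """Dollar-denominated return after one year of staking, net of node
    costs (taken as a fraction of rewards). All inputs are illustrative."""
    end_eth = stake_eth * (1 + apr_eth * (1 - cost_frac))
    return end_eth * price_end - stake_eth * price_start

# 14.25% APR paid in ETH, but a bear market drops the price from $190 to $130:
print(round(usd_return(32, 0.1425, 190, 130), 2))  # -1355.36: a loss in USD
```

Despite a double-digit yield denominated in ETH, the position loses over $1,300 in dollar terms under this assumed price drop.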

When will the transition to Ethereum 2.0 start

Ben Edgington of Teku, a ConsenSys Ethereum 2.0 client team, said at the latest summit that the transition to PoS could be launched in July this year. These deadlines, barring new delays, were also mentioned by analysts at the BitMEX crypto exchange in their recent report on the ecosystem's transition to Ethereum 2.0. However, on May 12 Vitalik Buterin denied the possibility of launching Ethereum 2.0 in July: the network is not yet ready and is unlikely to launch before the end of the year. July 30 marks the 5th anniversary of Ethereum's launch; unfortunately, it seems the update will not make it in time for the anniversary.

Full deployment of the update will consist of several stages.

Phase 0. Beacon chain. The "zero" phase, which could launch this year. In effect it will only be a network test and a trial of PoS without economic activity, but it will use new ETH coins and staking will become possible. The "zero" phase will test the first layer of the Ethereum 2.0 architecture with Lighthouse, an Ethereum 2.0 client in Rust that has been in development since 2018.

Phase 1. Sharding - abandoning full nodes in favor of load balancing across all network nodes (shards). This should increase network throughput and solve the scalability problem. It is the first full phase of Ethereum 2.0 and will initially be deployed with 64 shards. It is because of sharding that the transition to the new state is so complicated: existing smart contracts cannot be transferred to the new network, so at first - perhaps for several years - both networks will exist simultaneously.

Phase 2. State execution. In this phase various applications will run and it will be possible to conclude smart contracts. This is the full-fledged working Ethereum 2.0 network. After the second phase, the two networks - Ethereum and Ethereum 2.0 - will operate in parallel.
Coin holders will be able to transfer ETH from the first network to the second, without the ability to transfer it back. To stimulate support for both networks, coin issuance in each will increase until they merge. Read more about the phases of the transition to Ethereum 2.0 in the aforementioned BitMEX report.

How the upgrade to Ethereum 2.0 will affect the staking market and coin price

The transition of the second-largest coin to PoS will dramatically expand the staking market. A 32 ETH deposit is too large for most users, so we should expect exchanges to roll out staking offerings. The launch of such a service for November was announced by the largest Swiss crypto exchange, Bitcoin Suisse; it will have no minimum deposit, and the commission will be 15%. According to October estimates by Binance Research analysts, Ethereum's transition to 2.0 could double both the price of the coin and staking's share of the market, and would make ETH the most popular PoS currency.

Adam Cochran, partner at MetaCartel Ventures DAO and a DuckDuckGo developer, argued in his blog that Ethereum's transition to 2.0 would be the "biggest event" of the cryptocurrency market. He believes that a 3-5% return will attract the capital of large investors, and fear of missing out (FOMO) among retail investors will push them to buy coins actively. The planned mechanism of burning coins with each transaction will reduce the potential oversupply.

However, the BitMEX analysts in the report mentioned above believe that the update will not be as important an event as many expect, and will not significantly affect either the coin's price or the staking market. Initially it will be more a test of the PoS system than a full-fledged network: there will be no economic activity or smart contracts, and staking rewards will not be paid out immediately. Therefore, most economic activity will remain on the original Ethereum network, which will run in parallel with the new one. The analysts emphasized that because of staking, for some (in their view, short) time a large number of ETH will be locked on the network. Most likely, this will limit the supply of coins and push the price up.
However, the upgrade may also release some of the ETH currently locked in smart contracts, in which case the price will not rise. Moreover, the report's authors are not sure that demand for the coin will be long-term and stable. For that to happen, PoS and sharding must prove that they work reliably and deliver the benefits for which the update was started; if they do, a wave of smart-contract and DeFi protocol developers awaits the network. In any case, quick changes should not be expected: a full transition to Ethereum 2.0 will take years and will not be smooth - network failures are inevitable. We also believe that Ethereum staking should not be treated as yet another panacea for all the problems of the coin and the market. Most likely, the network's transition to PoS will not significantly affect the staking market, though it may positively affect the price of the coin. But counting on an ETH rally in anticipation of it is overly optimistic.
submitted by Smart_Smell to Robopay

Forbes solves the "Impossible Triangle" problem


Blockchain has been described as an omnipotent technology since its inception. It is expected to touch every walk of life and even reshape production relations. However, blockchain has a technical bottleneck called the "Impossible Triangle", which keeps it far from realizing that potential. The so-called "Impossible Triangle" of blockchain, also known as the "ternary paradox", means that no matter which consensus mechanism a blockchain network adopts to determine how new blocks are generated, it cannot satisfy the three requirements of throughput, security and decentralization at the same time.
For example, Bitcoin can theoretically guarantee security and decentralization on the basis of its enormous computing power, but its throughput is hard to improve: it is slow and expensive. EOS, which touts higher throughput as an important technological breakthrough, adopts the DPoS consensus mechanism, greatly reducing the number of nodes and drawing criticism for sacrificing the essence of decentralization. And although Ethereum, the "king of ten thousand chains", has sharding as its scaling solution, the technical difficulty has kept it from landing.
Forbes uses "zero-knowledge proof" technology to greatly improve throughput without sacrificing decentralization, solving the "Impossible Triangle" problem that has plagued the blockchain industry for years.
1、 Zero knowledge proof
First, let us introduce the concept of zero-knowledge proof. A zero-knowledge proof, as the name implies, lets someone fully prove that they are the legitimate owner of certain rights or knowledge without disclosing the relevant information - in other words, the "knowledge" revealed to the outside world is "zero". The prover convinces the verifier that he knows or possesses some information, but the proving process cannot disclose any of that information to the verifier.
Case 1: A wants to prove to B that he has the key to a room. Suppose the room's lock can only be opened with that key and by no other means. There are two ways:
① A shows the key to B, and B uses it to open the lock, proving that A has the correct key.
② B confirms that a certain object is inside the room. A opens the door with his own key, takes the object out and shows it to B, thereby proving that he does have the key.
The second method is a zero-knowledge proof. Its advantage is that throughout the proof, B never sees the key itself, so the key cannot leak.
Case 2: there is a circular corridor whose exit and entrance coincide, but somewhere in the middle of the corridor is a door that can only be opened with a key. A needs to prove to B that he has that key. Using a zero-knowledge proof, B watches A enter the corridor through the entrance and come back out through the exit. B gains no information about the key, yet this completely proves that A has it.
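The corridor story can be turned into a toy simulation. In the classic interactive version the verifier repeatedly challenges the prover to come out of a randomly chosen side; a prover without the key passes each round only by luck, so repeating the challenge makes cheating vanishingly unlikely. A minimal sketch (the protocol details are illustrative, not Forbes's actual construction):

```python
import random

def run_protocol(prover_has_key: bool, rounds: int = 32) -> bool:
    """Simulate a cave/corridor-style interactive zero-knowledge proof.

    Each round the prover walks to the door via a randomly chosen side
    (left or right), then the verifier calls out which side to come OUT of.
    A prover with the key can always comply; one without the key can only
    comply when the requested exit happens to match the side he went in.
    """
    rng = random.Random(42)  # fixed seed so the demo is reproducible
    for _ in range(rounds):
        side_in = rng.choice(("left", "right"))    # prover's hidden choice
        challenge = rng.choice(("left", "right"))  # verifier's challenge
        side_out = challenge if prover_has_key else side_in
        if side_out != challenge:
            return False  # caught cheating in this round
    return True           # verifier is convinced

print(run_protocol(True))   # an honest prover always convinces the verifier
print(run_protocol(False))  # a cheater fails with probability 1 - 2**-32
```

Crucially, the verifier learns nothing about the key itself - only that the prover can pass the door on demand.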
Practice has shown that zero-knowledge proofs are extremely useful in cryptography. If verification can be done with zero-knowledge proofs, many problems are solved effectively. So how does Forbes use zero-knowledge proofs to improve TPS?
2、 Second-layer scaling
It is difficult to solve the "Impossible Triangle" by directly modifying the blockchain architecture itself to raise throughput; with many nodes, improving TPS while preserving decentralization is technically very hard. So Forbes chose a roundabout route: leave the blockchain itself unchanged and raise TPS by adding a second-layer architecture.
Here is an everyday analogy:
If the Forbes public chain is regarded as a real-life bank, then making a transfer on the Forbes chain is like handling the transfer at the bank's counter - with the difference that the bank is centralized while the blockchain is decentralized.
When few people are around, handling a transfer at the bank is easy, but once a crowd arrives, a long queue forms and those at the back face a long wait. A blockchain is like that bank: when more people queue up to transfer, congestion forms. So improving the blockchain's throughput amounts to speeding up how fast the bank processes transfers.
But the bank is only so big, and there are only so many bank staff (think of the staff as the nodes of the blockchain). It is very hard for the bank itself to speed up transfers. This angers the people at the back of the line, but they have no choice.
Finally, one of the people at the back of the line could not bear to wait any longer. He stood up and said, "We can't just wait - we have to find a way to improve efficiency." The others said, you are not a banker, what can you do? So the man said confidently, "Watch my operation, and cooperate with me."
He pulls out a ledger and, starting from the fifth person in line, records in detail the balance of each person's account after their transfers, then asks each person to confirm the record with a fingerprint. After the last person is recorded, he holds a ledger of everyone's final account balances. Although the ledger contains no individual transfer records, it accurately records everyone's resulting balances - and no matter how many times people transferred among themselves, all anyone cares about is the final balance.
Just as he finishes his tally, the fourth person in line completes their transfer at the counter. He walks into the bank with the ledger and explains that these are the account balances of everyone from the fifth person onward. The bank only needs to update those balances in its system.
The bank is impressed: a staff member swipes once and updates all those balances in one stroke, so the bank's transfer throughput increases several hundredfold.
This is how Forbes implements it. A second-layer node called a relay collects the transfer information of queued users and verifies their signatures. After computation, it aggregates the final token balances of the addresses into a Merkle tree and submits it to the chain, where everything is processed in one batch.
We call this method of improving blockchain TPS "second-layer scaling".
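The relay's batching step boils down to committing to everyone's final balances with a single Merkle root that the chain can process in one go. A minimal sketch (the leaf encoding and hash choice are illustrative assumptions, not Forbes's actual format):

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Compute a binary Merkle root over a list of leaf byte-strings."""
    if not leaves:
        return h(b"")
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                  # duplicate the last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# A relay could summarize final balances like this (illustrative encoding):
balances = {"alice": 120, "bob": 45, "carol": 300}
leaves = [f"{addr}:{bal}".encode() for addr, bal in sorted(balances.items())]
root = merkle_root(leaves)
print(root.hex())  # a single 32-byte commitment submitted on-chain
```

Any change to any balance changes the root, so the chain can detect a tampered ledger from the commitment alone.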
At first glance this scheme seems perfect, but in practice various problems arise. For example:
  1. How can the bank believe that the person holding the final ledger actually counted the transfer requests of everyone in the queue?
  2. What if this person, out of a personal grudge, deliberately omits the transfers of people he dislikes?
  3. What if this person secretly alters the account balances on the way to the bank?
At this time, zero knowledge proof will be of great use.

3、 Zero-knowledge proof + second-layer scaling + smart contracts
Solving the problems above really means solving the problem of trust. The bank is no fool: just have the bank send its own staff. Before departure, each staff member is issued a "work permit" and an open box with a lock. While he tallies transfers for the people in line, the ledger is safe, because the crowd supervises him. After counting the last person, the staff member puts the ledger into the box and locks it. That way he cannot cheat and modify the account data on the way to the bank. On arrival, the bank recognizes only the "work permit", confirming this is its own staff member, and without even opening the locked box it can be sure this person is trustworthy.
Notice that throughout the process the bank obtains zero account information, yet it believes that the transfer data this person tallied is safe and reliable - that is zero-knowledge proof.
The principle of Forbes's technology is exactly the same. The main chain uses a zero-knowledge circuit to generate a certificate called a proof. When a relay tallies users' transfer information, it packages the Merkle tree of the general ledger and submits it together with the proof. On seeing the package, the main chain verifies the proof, executes the balance updates for the affected addresses, and then broadcasts the result to all nodes.
But one problem remains unsolved: what if the staff member deliberately skips the bookkeeping of people he dislikes? Or demands a tip from users and refuses to record those who will not pay?
In fact, this is easy to handle too. Anyone skipped or shaken down for a tip will angrily complain to the bank, and after the bank verifies the complaint it simply deducts the amount from the staff member's own account balance.
Accordingly, Forbes deploys smart contracts on the main chain and requires each newly added relay to pledge a sufficient amount of GFS there. If a relay omits a user's transfer request or deliberately inflates the transfer fee, the main chain deducts the relay's pledged GFS through the smart contract to compensate the user's loss.
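The pledge-and-slash arrangement can be sketched as a toy contract. All names and amounts below are illustrative assumptions, not Forbes's actual GFS contract:

```python
class RelayRegistry:
    """Toy sketch of the pledge-and-slash logic described above."""

    def __init__(self, min_pledge: int):
        self.min_pledge = min_pledge
        self.pledges = {}  # relay address -> pledged GFS

    def register(self, relay: str, amount: int):
        """A relay must pledge at least the minimum to join."""
        if amount < self.min_pledge:
            raise ValueError("pledge below minimum")
        self.pledges[relay] = amount

    def slash(self, relay: str, penalty: int, refund_victim=None):
        """Deduct pledged GFS from a misbehaving relay; optionally
        forward the deducted amount to compensate the harmed user."""
        staked = self.pledges.get(relay, 0)
        taken = min(penalty, staked)      # cannot take more than is pledged
        self.pledges[relay] = staked - taken
        if refund_victim:
            refund_victim(taken)
        return taken

registry = RelayRegistry(min_pledge=1000)
registry.register("relay-1", 1500)
print(registry.slash("relay-1", 400))  # 400 GFS deducted, 1100 remains
```

The economic point is that misbehavior has a direct, automatic cost, so a rational relay is better off recording every transfer honestly.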
If you have read this far, congratulations - you now understand Forbes's technical solution for improving TPS. Backed by a huge distributed mining pool, Forbes not only has a large number of nodes providing very high security and decentralization, but also uses zero-knowledge proofs + second-layer scaling + smart contracts to raise TPS above 10,000 with ease, solving the blockchain "Impossible Triangle".
You have surely noticed the detail that relays must pledge GFS. From that detail alone, a smart reader can predict the future value of GFS without further explanation.
submitted by forbeschain to u/forbeschain

White Paper, Miner, Pizza … | "Old Objects" in the Cryptocurrency Museum

A museum plays the role of a time recorder. Speaking of Bitcoin, more than ten years have passed since its creation. Although that is nothing next to the stock market's century of history, over those ten years the successive stages of Bitcoin's and blockchain's development have continuously drawn in geeks, miners, speculators and newcomers, leaving behind keywords such as overnight riches, myth, scam, belief, revolution, and so on.
There are also many "old objects" with stories in the "museum" of the cryptocurrency world. On Museum Day, let's review the stories behind these "old objects".
The First Digital Currency White Paper — Bitcoin White Paper
On Oct. 31, 2008, Satoshi Nakamoto released the Bitcoin white paper — A Peer-to-Peer Electronic Cash System in the cryptographic mail group where he belongs, and Bitcoin was born since then.
A white paper is a document that explains the purpose of a cryptocurrency and the technology behind it. Usually a cryptocurrency uses its white paper to help people understand what it offers, and the white paper is also an important channel for investors to learn about a project. The quality of the white paper therefore affects people's confidence in the coin.
In a word, in the cryptocurrency and blockchain industry, a white paper is worth as much as a polished fundraising pitch. It plays a vital role in this emerging market.
The First Public Bitcoin-Physical Transaction — Pizza
After Satoshi Nakamoto mined the Bitcoin genesis block on January 3, 2009, Bitcoin spread only among a small crowd and had not realized its value.
That changed on May 22, 2010, when Bitcoin enthusiast Laszlo Hanyecz bought a pizza coupon worth $25 with 10,000 bitcoins - the first public exchange of bitcoin for a physical good. It gave Bitcoin its first price: 0.25 cents per bitcoin.


This day became the famous "Bitcoin Pizza Day" in Bitcoin's history, the moment Bitcoin moved from a financial thought experiment to something of practical significance. The tenth anniversary is coming - how will you commemorate it? Will you buy a pizza?
The First Digital Asset Exchange — Bitcoinmarket.com
After the birth of Bitcoin, the only ways to obtain it in the early days were mining or trading on forums or IRC (Internet Relay Chat). That method, however, meant long transaction times and serious security risks.
In March 2010, the first digital asset exchange, Bitcoinmarket.com, launched. Due to a lack of liquidity and trading depth it disappeared soon after its establishment, but Bitcoinmarket.com opened the 1.0 era of cryptocurrency exchanges.


On June 9, 2011, China's first Bitcoin exchange, Bitcoin China (BTCChina), launched. Its founder, Yang Linke, translated "Bitcoin" into Chinese as "比特币" for the first time. In 2013, China's bitcoin trading entered a golden age and exchanges sprang up, with China handling more than 90% of the world's bitcoin transactions. If today's top three exchanges - Binance, Huobi Global, OKEx - represent Exchange 2.0, then index exchanges such as 58COIN can be called version 3.0, leading the trend.
The First Generation of High-Performance Miner — ASIC Miner
When Satoshi Nakamoto created Bitcoin, the only way to get it was to mine with computers (including home computers), relying mainly on the CPU. But as the value of digital currencies such as Bitcoin rose, mining became an industry: competition grew fiercer and difficulty kept increasing, and so a hardware performance race began.
In July 2012, Jiang Xinyu (known online as "Friedcat"), a prodigy from the gifted youth class of the University of Science and Technology of China, declared on a forum that he could make ASIC miner chips. In terms of mining hash rate, ASICs can be tens of thousands of times faster than same-generation CPUs and GPUs.
At the beginning of 2013, Zhang Nangeng ("Pumpkin Zhang"), a doctoral student on leave from the Beijing University of Aeronautics and Astronautics, developed an ASIC miner and named it "Avalon".


In June 2013, Friedcat's USB miner was finally released, and it maintained 20% of the entire network's computing power.
At the end of 2013, Wu Jihan used the tens of millions of yuan he had earned investing in Friedcat and, together with the Jenke group, developed the Antminer S1. From then on, the miner manufacturer Bitmain entered the stage of history.
It is no exaggeration to say that Friedcat and Zhang Nangeng opened China's "mining" era.
The Birthplace of China’s Bitcoin — Garage Coffee
It is not only the “old objects” that record history, but also a place that everyone in the cryptocurrency realm aspires to.
Guo Hongcai once said, "Without The Garage Café, there would be no cryptocurrency circle today." It is a rather legendary place where wave after wave of people came together to create today's digital asset industry.

▲ In March 2013, American student Jake Smith successfully purchased a cup of coffee at The Garage Café with 0.131 bitcoins. This move attracted the attention of CCTV, and it conducted an interview.
Indeed, The Garage Café is the world's first startup-themed coffee shop, legendary since its founding in 2011. It is not only the core coordinate on China's Bitcoin map, but also the birthplace of the Chinese cryptocurrency circle, where digital asset tycoons including Guo Hongcai, Zhao Dong, Li Xiaolai and Li Lin made their names.
Digital currency is only 11 years old. Through these "old objects" we have revisited the stories of this technological wave, hoping to help you understand how the field developed. Meanwhile, I also remind all practitioners to use history as a mirror and forge ahead.
Website: https://www.58ex.com/
Twitter: https://twitter.com/58_coin
Facebook: https://www.facebook.com/coin.58COIN
Telegram: https://t.me/official58
Medium: https://medium.com/@58coin_blog/
submitted by 58CoinExchange to u/58CoinExchange

The 8 Skills to Be a Good Miner

Many people may feel confused about their low profits. Perhaps you forget to think about the small details when you are mining - and small details make a big difference in your final income.
Now, I want to share the 8 skills that will improve your returns.
1, Get a cheaper power
Everyone knows that power is the biggest cost in mining, so finding cheaper electricity helps. How do you get cheaper electricity?
About 55% of mining happens in China, and 40% of it in Sichuan. Why? Because there are many hydroelectric power stations there. Set up near a station and you can buy cheaper electricity from it.
If you can find free electricity, that is best of all.
2, Choose a low-W/T machine
As you know, low-consumption machines are very popular these days, like the S17 Pro (53T) or T17 (42T). Built on 7nm chips, their watts-per-terahash (W/T) figure is low and they can even be overclocked, so they may be a good choice. We also need to weigh the machine's price.
A cheap machine means a fast ROI, but a low-W/T machine has a brighter future.
3, Buy miners when BTC starts to rise after a long drop
When the BTC price keeps falling, machines get cheaper and cheaper. When the BTC price starts to rise is the time to buy, because machines are at their cheapest and you can earn your money back sooner.
Usually the good machines sell out quickly at that moment; by the time the market confirms which machines are good, you may have missed the chance. So make your purchasing plan in advance, and when the price drops, act.
4, Do not forget the BCH, BSV and ZEN coins
Remember that SHA-256 machines can mine BCH and BSV as well, and sometimes those coins are even more profitable than BTC.
Some miners default to BTC, but you can switch them to BSV or BCH mining in the settings.
5, Pay attention to the halving
The halving arrives in 2020, and it brings both opportunity and risk. Many low-hashrate machines may become obsolete, while high-hashrate machines will be more competitive.
Lower your risk and avoid buying those cheap machines now.
6, Choose a cryptocurrency with a good future
There are many coins in this field now; we need to analyze them and find a better direction for mining. For example, many people use the Z11 for ZEN mining nowadays, and their profits are among the best.
People are also buying many S17s, which can pay for themselves before next year's halving - and they believe the BTC price will rise sharply, as it did after the last two halvings.
7, Plan your selling of coins and machines
The price of BTC changes constantly. We can mine BTC and hold it rather than selling every day - selling daily is unwise. Sell when the price is high; if you did not buy the BTC directly, you take no extra risk by waiting. Do not worry about low-price periods; just wait, and when the chance comes, take it.
The same goes for machines.
8. Don't be fooled by the mining calculator
Many sites calculate mining profits based on hardware and electricity prices. If you've never mined before, you might be happy to see the numbers provided by these websites and calculators and think, "I'll make a fortune!"
However, these websites don't tell you that besides electricity there may be other ongoing costs, such as maintenance, cooling, rent, labor, etc. Generally, the actual hash rate and power consumption of a device differ slightly from what the factory claims.
This difference is more common with unpopular brands. You can get a better sense of the real hash rate and power consumption by watching miner test videos on YouTube. In addition, depending on the distance from the meter to the device and the type of cable used, power loss between the meter and the device can be as high as 200 watts.
In addition to the cost of mining machines, some initial costs are required to prepare the infrastructure, such as cooling and venting, cabling and distribution, shelves, network and monitoring equipment, safety measures, etc.
Network difficulty is constantly changing and rising at a significant pace, which directly affects mining revenue. You can check the Bitcoin network difficulty chart to see its growth rate - and remember that your miner will not always be 100% active.
Due to maintenance, network problems, mining pool problems, power problems and many other issues, a miner may be offline for hours at a time. I suggest assuming an uptime below 97% in your calculations; even in professional mining farms with rich experience, uptime does not exceed 97-98%.
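The warnings in this section can be folded into a back-of-the-envelope profit calculator. A sketch - every figure here (block reward, network hash rate, overhead fraction, cable loss) is an illustrative assumption, not live data:

```python
def daily_profit(hashrate_th, power_w, elec_usd_kwh,
                 btc_price, network_hashrate_th, block_reward=6.25,
                 uptime=0.97, power_loss_w=200, overhead=0.10):
    """Estimate daily mining profit in USD, including the deductions the
    text warns about: imperfect uptime, meter-to-device power loss, and
    overhead (maintenance, cooling, rent, labor) as a fraction of revenue.
    All default values are illustrative assumptions."""
    blocks_per_day = 144  # one block roughly every 10 minutes
    btc_per_day = (hashrate_th / network_hashrate_th) * blocks_per_day \
                  * block_reward * uptime
    revenue = btc_per_day * btc_price
    kwh_per_day = (power_w + power_loss_w) / 1000 * 24
    power_cost = kwh_per_day * elec_usd_kwh
    return revenue * (1 - overhead) - power_cost

# Example: an S17-class miner (53 TH/s, ~2094 W) on made-up network figures
print(round(daily_profit(53, 2094, 0.04, 9000, 110_000_000), 2))  # 1.2
```

Notice how much the uptime, loss and overhead terms shave off the naive number an online calculator would show - and how easily a higher electricity price pushes the result below zero.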
That's all - I hope this information helps you become a good mining investor.
submitted by 15Ansel to BitcoinMining [link] [comments]

In-depth interview with Mr. Feng: MW is more than the commercial incentive layer of IPFS

Why did early bitcoin players take up MW? Why is blockchain + distributed storage the only industry that can combine mining with practical use? How can we do something beneficial for society's storage needs while consuming so many resources? Our special guest Mr. Feng - an early believer in Bitcoin and co-founder of the Mirror World Network (MW) - answers these questions one by one!
Hello, I'm Mr. Feng. I started my business in 2012, and I was an early believer in Bitcoin. After two years of silence, I returned to the industry with the help of a group of friends. During that period we did a lot of research on distributed storage, including IPFS, and this project is likewise in the distributed storage field. I think blockchain + distributed storage is the only industry that can combine mining with practical use: while consuming a lot of resources, it can also be a commercial storage network that benefits society. MW is a mature, production-ready project that applies IPFS technology to real storage, creating a workable path for the blockchain industry to land.
1. After your self-introduction, many in the audience will wonder what Mirror World Network has been doing recently. Can you share it with us? What has Mirror World Network achieved over these years?
MW is building an easy-to-use, readily available distributed storage network, which creates a new computing paradigm and collaboration model for building trust at low cost in a trustless, competitive environment. We have three years of technical groundwork and have drawn on the experience of IPFS, Alibaba Cloud OSS, Storj and other technologies at home and abroad. The code base is currently close to 900,000 lines and will be open-sourced in the future. We have already been running a fairly stable internal test network, with many friends participating. In the next three to four months we will publish our technical results, including practical cases - feel free to leave us a message.
2. What is the difference between MW and Filecoin? What is the core competitiveness of MW?
I believe you have known about IPFS for a long time. Filecoin is the incentive layer of IPFS; to put it simply, it is the financing tool of IPFS, and it was dazzling at the time. Storj and Sia also did well then, and we chose this path at that time as well. The original intention of MW is to do real distributed storage, and I think that is where MW differs from Filecoin in terms of starting point. The core competitiveness of MW is technological inclusiveness: we integrate decentralized and centralized storage protocols to resolve the tension between decentralized storage protocols, the regulatory layer, and practicality.
3. The MW public chain can now apply IPFS technology to actual storage very maturely, and it is the only one to do so. What difficulties did MW encounter, and how did you solve them?
In January, we solicited opinions on a small scale within the industry. At that time, we fully demonstrated the storage function and blockchain information, which was unanimously recognized. Because the team consists mainly of technical members, the economic model was probably the biggest difficulty. After extensively collecting opinions, we adjusted it no fewer than ten times. Finally, we chose an open and inclusive community governance scheme: no fixed model, fair competition, and community motivation are our goals.
4. MW will open its test network on April 18. How should interested users participate in the test? Is there a reward for testing?
Yes, the public beta will officially launch on April 18, 2020. It will be divided into three stages: Pioneer, Union, and World. You can go to mw.run to see the roadmap. There is a threshold at the earliest part of the Pioneer stage: we need to manually authenticate the added equipment to ensure the stability and robustness of the network's initial phase, but there will be no block reward, only a contribution reward provided by the foundation. After the network is stable, we will open up the block reward and enter the computing-power contest period. There will be rewards throughout the whole public beta; you only need to send an email to [email protected] to apply to join.
5. Distributed storage mining has always been a concern of miners. What should one pay attention to when mining on the MW test network? What are the requirements for mining machines? What factors will affect mining revenue?
I would like to share MW's consensus mechanism with you. In order to let more storage devices join the MW ecosystem fairly, and to further increase the number of stable nodes in the network and improve its dispersion, MW adopts DPoS consensus plus a PoC consensus based on a weight table. How should you understand this? MW is in fact a very inclusive project; simply put, it is as straightforward as Bitcoin mining through competition for computing power. We have a strict weight calculation and distribution mechanism, which is equivalent to the law of the whole network and will be announced in the genesis block. At present, the size of storage space has the most direct impact on income. In addition, we have a unique mining pool system, where anyone can establish a mining pool and share mining dividends without owning equipment. We do not have many requirements for mining machines; at present, we only have requirements for the network environment, and you need a public IP.
6. In your opinion, what is a truly "visual" IPFS storage system? How does MW achieve "availability" and "ease of use" when building a distributed file storage network?
"Visualization" is actually very easy to understand: it means something you can see and feel. We have now developed a complete visual storage path, and MW is a typical representative of visual storage. Here I highly recommend that you experience our internal test network. Like using a network disk, your files are segmented, hashed, and encrypted after uploading, and can finally be completely recovered and downloaded. We also made a short tutorial, which interested friends can watch to understand the process. In addition, our goal is to make the IPFS distributed storage system available to all ordinary people rather than setting too high a threshold, which is what ease of use and availability mean to us.
If you want to participate in the internal test, you can contact us before April 18 to register, and we will also provide 1,000 coins for free. After the test network officially launches on April 18, all internal test data will be reset.
7. Beyond being the commercial incentive layer of IPFS, what other value can MW provide us? What is the ultimate vision of MW?
The MW network can do the following:
a. Establish an open, distributed blockchain storage network, form a multi-chain ecology with existing networks and public chains, and complete data and value transmission.
b. Set up a distributed storage network composed of idle storage resources from enterprises and individuals, and deploy various public chains, storage networks, and individual nodes.
c. Build a global distributed cloud storage compatible with IPFS, public cloud storage, and private cloud storage.
Secondly, we should talk about our overall chain architecture. In the public chain part, MW is an open blockchain + distributed storage system, which mainly provides benefits for miners and maintains the stability of the network. We will also build a consortium chain in China, where MW will become an application network of small distributed data centers, providing users with low-cost, secure, and highly private storage services; it can also be used for supervision and auditing of enterprises and governments in certain specific fields or scenarios.
Finally, we can provide data backup, verification, and query services for the data of other public (open-source) chains. MW is an underlying system focused on distributed storage.
8. Security has always been a key concern. In terms of data security, how does Mirror World Network ensure it?
Data loss and privacy are the focus of data storage. I'm sure you have heard plenty of news about customer information leaks, downtime, server crashes, the selling of customer data, and so on. These are problems that centralized storage inevitably faces. With the continuous growth of data and people's rising requirements for data security, the storage model is also iterating, and the IPFS protocol is a very good solution for privacy handling. On the basis of IPFS, MW also uses technologies such as file segmentation, multiple backups, encryption, multiple roles, and error correction to ensure users' data security. One of the simplest guarantees to understand is that we always automatically keep three copies of each file fragment across network nodes.
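The segment-hash-replicate idea described above can be sketched in a few lines. This is a toy illustration only: the chunk size, replica count, and round-robin placement are hypothetical, not MW's actual parameters or algorithm.

```python
# Toy sketch of segment / hash / replicate storage (illustrative parameters only).
import hashlib

CHUNK = 4          # bytes per fragment (tiny, for illustration)
REPLICAS = 3       # the interview mentions three copies per fragment

def chunk_and_hash(data):
    """Split data into fixed-size fragments and content-address each by its hash."""
    pieces = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]
    return [(hashlib.sha256(p).hexdigest(), p) for p in pieces]

def place(pieces, nodes):
    """Assign each fragment to REPLICAS distinct nodes (toy round-robin placement)."""
    placement = {}
    for k, (digest, _) in enumerate(pieces):
        placement[digest] = [nodes[(k + r) % len(nodes)] for r in range(REPLICAS)]
    return placement

pieces = chunk_and_hash(b"hello distributed storage")
layout = place(pieces, ["node-a", "node-b", "node-c", "node-d"])
print(len(pieces))                                 # 7 fragments
print(all(len(v) == 3 for v in layout.values()))   # True: each fragment on 3 nodes
```

Because each fragment is addressed by its hash, any node returning a fragment can be checked against that hash, which is the same content-addressing principle IPFS relies on.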
9. With the advent of the 5G and cloud computing era, people have new requirements for bandwidth and traffic. What is MW's layout? How will distributed storage develop in the future?
We are full of expectations for the 5G era, which is one of the reasons we have only now launched MW. Small distributed storage and computing centers close to users are better suited to the needs of the coming era. We will set up a demonstration data center in the public test network and conduct a commercial demonstration of the storage space provided by enterprises.
At the end of last year, I read a jointly issued research report: by 2023, the volume of stored data will be twice that of 2019. The industry is currently in a stage of rapid development, during which distributed storage will enter the mainstream storage market. We have planned a three-year development path, starting with cold data such as archived data and infrequently accessed data; hosting public chain miners as a data center is the business model of our consortium chain. Compared with a traditional data center or cloud, we have a natural price advantage: we can even reach 10% of the price of equivalent Tencent Cloud and Alibaba Cloud products. When 5G/6G is mature, we will enter the mainstream storage market.
In the future, we also hope that global storage providers, open-source public chains, and enterprises and individuals with storage resources will join Mirror World Network to provide a solid infrastructure for future storage methods, and earn appropriate rewards.
submitted by MirrorWorldNetwork to u/MirrorWorldNetwork

How to Assess the Value behind Cryptocurrencies

Many of the investors and financial institutions I talk to are hesitant to invest in cryptocurrencies, often saying that they can't determine their real value. If we were looking to buy equity in a company, we could look at its fundamentals and make a prudent decision about whether to invest in it or not. Crypto is different in that it is in its early days and cannot present evidence of a long track record.
Admittedly, the process of value assessment may not be as straightforward for cryptocurrencies as for some of the more traditional asset classes. However, we can still refer to certain other drivers to help us form an assessment of value.
Let’s start with the original cryptocurrency, Bitcoin, and discuss how it compares to gold and commodities.
Valuing Bitcoin — Stock to Flow Ratio
Bitcoin is often valued using the stock to flow ratio, which quantifies the “hardness” of an asset. A report by Bayerische Landesbank found that:
“Applied to Bitcoin, an unusually strong correlation emerges between the market value of this cryptocurrency and the ratio between existing stockpiles of Bitcoin (“stock”) and new supply (“flow”).”
The book “The Bitcoin Standard” by Saifedean Ammous introduced the stock-to-flow approach in relation to valuing Bitcoin. The supply of Bitcoin cannot be engineered at will: Satoshi built into the protocol a drastic decline in supply growth (the block reward halves every four years). New supply is also decoupled from mining effort, because as the price rises, the difficulty of mining Bitcoin increases, so the flow of new coins does not rise correspondingly.
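The stock-to-flow arithmetic is simple enough to sketch directly. The figures below are illustrative round numbers for the period around the May 2020 halving (roughly 18.375 million BTC mined, block reward dropping from 12.5 to 6.25 BTC), not precise chain data:

```python
# Stock-to-flow sketch with illustrative round numbers (not live chain data).
def stock_to_flow(stock, annual_flow):
    """Stock-to-flow ratio: existing stockpile divided by annual new supply."""
    return stock / annual_flow

BLOCKS_PER_YEAR = 6 * 24 * 365          # ~52,560 blocks at one block per ~10 minutes
stock = 18_375_000                      # approx. BTC mined by May 2020
flow_before = 12.5 * BLOCKS_PER_YEAR    # 657,000 BTC/year pre-halving
flow_after = 6.25 * BLOCKS_PER_YEAR     # 328,500 BTC/year post-halving

print(round(stock_to_flow(stock, flow_before), 1))  # 28.0
print(round(stock_to_flow(stock, flow_after), 1))   # 55.9
```

The halving roughly doubles the ratio overnight, which is exactly the effect the stock-to-flow model points to when comparing Bitcoin with gold.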
This supply profile is guaranteed by the existing setup: if it were to change, it would undermine the peer-to-peer network of participants who hold bitcoin and dilute the value of their coins.
By comparison, the stock-to-flow ratio is how gold is valued. Gold is used as a store of value in hard times. Its supply cannot be increased in huge quantities: the annual production of fresh gold (“flow”) is limited, adding only incrementally to the existing stockpile (“stock”). Gold is therefore described as having a high stock-to-flow ratio. However much the price of gold increases, production cannot be ramped up exponentially, which would otherwise dilute the stock-to-flow ratio.
The next Bitcoin halving is due to take place in May 2020, potentially hugely increasing the stock to flow ratio of Bitcoin. It will be interesting to see what that does to the Bitcoin price.
Valuing according to utility
A cryptocurrency must have a strong use case to incentivize people to hold the coins. How useful a coin is feeds through to its value.
If we take the example of Ether: in order to execute commands and develop applications on the Ethereum blockchain, you need to own Ether. Ether is converted into “gas”, which is used to run the network. Ether is, therefore, the currency used to drive transactions and development on the Ethereum blockchain. The more people transact with and on Ethereum, the greater the demand for Ether becomes, eventually leading to a price increase.
“Users will use the infrastructure that offers them the applications they need. And yes, at the moment this is clearly Ethereum. There are more Apps and smart contracts deployed on Ethereum than on all other application-focused blockchain protocols put together.” Max Lautenschläger, Managing Partner, Iconic Holding
Therefore, the price of a utility protocol is contingent upon the community engaging with it and adopting the applications built on top of it. As long as people continue to build and adopt because it is useful to them, that growing utility will continue to drive value.
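The gas mechanism described above reduces to simple arithmetic. This is a minimal sketch of the pre-EIP-1559 fee model (fee = gas used x gas price); the gas price chosen is illustrative:

```python
# Minimal Ethereum fee arithmetic (pre-EIP-1559 model; illustrative gas price).
WEI_PER_ETHER = 10**18
GWEI = 10**9

def tx_fee_ether(gas_used, gas_price_gwei):
    """Fee in Ether: gas consumed times the gas price the sender offers."""
    return gas_used * gas_price_gwei * GWEI / WEI_PER_ETHER

# A plain ETH transfer costs a fixed 21,000 gas.
print(tx_fee_ether(21_000, 20))  # 0.00042 ETH at 20 gwei
```

The key point for valuation is that every unit of on-chain activity consumes gas, and gas must be paid for in Ether, so demand for block space translates into demand for the coin.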
There are many other types of cryptocurrencies and crypto assets, as my colleague highlighted in a recent article. Crypto may be in its early stages and be extremely volatile, but traditionally-minded investors and financial institutions can rest easy knowing there are standard ways through which value can be calculated.
# # #
This article is strictly for educational purposes and isn’t to be construed as financial advice.
By Sara Sabin, Business Development, Iconic Holding
submitted by IconicLab to u/IconicLab [link] [comments]

Threshold Signature Explained— Bringing Exciting Applications with TSS

— A deep dive into threshold signature without mathematics by ARPA’s cryptographer Dr. Alex Su

Threshold signature is a distributed multi-party signature protocol that includes distributed key generation, signature, and verification algorithms.
In recent years, with the rapid development of blockchain technology, signature algorithms have gained widespread attention in both academic research and real-world applications. Properties such as security, practicality, scalability, and decentralization of signatures are being closely examined.
Because blockchain and signatures are closely connected, the development of signature algorithms and the introduction of new signature paradigms directly affect the characteristics and efficiency of blockchain networks.
In addition, the key management needs of institutional and personal accounts, stimulated by distributed ledgers, have spawned many wallet applications, and this change has also affected traditional enterprises. Whether in blockchain or in traditional financial institutions, threshold signature schemes can improve security and privacy in various scenarios. As an emerging technology, threshold signatures are still under academic research and discussion, with unverified security risks and practical problems remaining.
This article starts from the technical rationale, covering cryptography and blockchain. We then compare secure multi-party computation with threshold signature, and discuss the pros and cons of different signature paradigms. At the end, there is a list of threshold signature use cases, so that the reader may quickly get up to speed on the topic.
I. Cryptography in Daily Life
Before introducing threshold signatures, let's get a general understanding of cryptography. How does cryptography protect digital information? How does one create an identity in the digital world? At the very beginning, people wanted secure storage and transmission. After one creates a key, one can use symmetric encryption to store secrets, and if two people share the same key, they can communicate securely. For example, a king encrypts a command and his general decrypts it with the corresponding key.
But when two people do not have a secure channel, how can they create a shared key? Thus the key-exchange protocol came into being. Analogously, if the king issues an order to all the people in the digital world, how can everyone prove that the order originated from the king? For this, the digital signature protocol was invented. Both protocols are based on public-key cryptography, also known as asymmetric cryptography.


The “Tiger Tally” was a troop-deployment token used by ancient Chinese emperors, made of bronze or gold in the shape of a tiger and split in half: one half was given to the general and the other kept by the emperor. Only when the two halves were combined did the holder gain the authority to dispatch troops.
Symmetric and asymmetric encryption constitute the main components of modern cryptography. Both have three fixed parts: key generation, encryption, and decryption. Here, we focus on digital signature protocols. The key generation process produces a pair of associated keys: a public key and a private key. The public key is open to everyone, while the private key represents identity and is revealed only to its owner; whoever owns the private key holds the identity it represents. The signing algorithm takes the private key as input and generates a signature on a piece of information. The verification algorithm uses the public key to verify the validity of the signature and the correctness of the information.
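The keygen / sign / verify triple can be made concrete with a toy Schnorr-style signature. The group parameters here are deliberately tiny for readability; real systems use standardized curves (e.g. ed25519) and never hand-rolled groups like this:

```python
# Toy Schnorr-style signature: key generation, signing, verification.
# Tiny illustrative parameters -- NOT secure, for exposition only.
import hashlib
import secrets

p, q, g = 2579, 1289, 4   # p = 2q + 1 (both prime); g generates the order-q subgroup

def H(r, msg):
    """Hash the commitment and message down to a challenge in [0, q)."""
    data = str(r).encode() + msg
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def keygen():
    x = secrets.randbelow(q - 1) + 1       # private key: random exponent
    return x, pow(g, x, p)                 # public key: g^x mod p

def sign(x, msg):
    k = secrets.randbelow(q - 1) + 1       # fresh nonce per signature
    r = pow(g, k, p)                       # commitment
    e = H(r, msg)                          # challenge
    return e, (k + x * e) % q              # signature (e, s)

def verify(y, msg, sig):
    e, s = sig
    r = pow(g, s, p) * pow(y, (-e) % q, p) % p   # g^s * y^-e reconstructs g^k
    return H(r, msg) == e

x, y = keygen()
sig = sign(x, b"the king's command")
print(verify(y, b"the king's command", sig))   # True
print(verify(y, b"a forged command", sig))     # rejected (almost surely, in this toy group)
```

Only the holder of `x` can produce `(e, s)` satisfying the check, while anyone with the public key `y` can verify it, which is exactly the asymmetry the text describes.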
II. Signature in the Blockchain
Looking back at blockchain: it uses a consensus algorithm to construct a distributed ledger, and signatures provide identity information for the blockchain. Every transaction on the blockchain is identified by the signature of the transaction initiator, and the network can verify that signature according to specific rules to check the transaction's validity, thanks to the immutability and verifiability of signatures.
For cryptography, blockchain is more than a user of signature protocols or of the hash functions behind Proof-of-Work consensus. Blockchain builds an infrastructure layer for consensus and transactions, and on top of it, novel cryptographic protocols such as secure multi-party computation, zero-knowledge proofs, and homomorphic encryption thrive. For example, secure multi-party computation, which is naturally adapted to distributed networks, can build secure data-transfer and machine-learning platforms on the blockchain, while the special nature of zero-knowledge proofs makes verifiable anonymous transactions feasible. The combination of these cutting-edge cryptographic protocols and blockchain technology will drive the development of the digital world in the next decade, leading to secure data sharing, privacy protection, and applications as yet unimagined.
III. Secure Multi-party Computation and Threshold Signature
Having introduced how digital signature protocols affect our lives and help the blockchain establish identities and record transactions, we now turn to secure multi-party computation (MPC), from which we can see how threshold signatures achieve decentralization. For more about MPC, please refer to our previous posts, which detail the technical background and application scenarios.
MPC, by definition, is a computation that several participants execute jointly and securely. Security here means that, in one computation, all participants provide their own private inputs and can obtain the result, but no one can learn any private information entered by the other parties. In 1982, when Prof. Yao proposed the concept of MPC, he gave an example called the “Millionaires' Problem”: two millionaires want to know who is richer without revealing their true assets. Specifically, secure multi-party computation cares about the following properties:
  • Privacy: No participant can obtain any private input of other participants, except for what can be inferred from the computation results.
  • Correctness and verifiability: The computation should execute correctly, and the legitimacy and correctness of this process should be verifiable by participants or third parties.
  • Fairness (robustness): All parties involved in the computation, unless agreed otherwise in advance, should either all obtain the result at the same time or none obtain it.
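The simplest concrete instance of these ideas is additive secret sharing: each party splits its private input into random shares, the parties combine shares, and only the sum of all inputs is revealed. This is a minimal sketch of the MPC idea, not a full protocol (it omits the communication layer and any malicious-party defenses):

```python
# Additive secret sharing: the joint sum is computed, no single input leaks.
import secrets

MOD = 2**64  # all arithmetic is modular so shares look uniformly random

def share(value, n):
    """Split `value` into n additive shares modulo MOD."""
    shares = [secrets.randbelow(MOD) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % MOD)   # shares sum back to value
    return shares

def joint_sum(inputs):
    n = len(inputs)
    all_shares = [share(v, n) for v in inputs]   # each party shares its input
    # Party i sums the i-th share of every input; only these partial sums are published.
    partial = [sum(col) % MOD for col in zip(*all_shares)]
    return sum(partial) % MOD

print(joint_sum([40, 60, 23]))  # 123: the total, with no individual input revealed
```

Each individual share is uniformly random, so seeing any n-1 of a value's shares reveals nothing about it: that is the privacy property; correctness holds because the shares of each input sum back to the input.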
Supposing we use secure multi-party computation to produce a digital signature in the general sense, we would proceed as follows:
  • Key generation phase: all future participants jointly do two things: 1) each party generates a secret private key share; 2) the public key is computed from the set of private key shares.
  • Signature phase: the participants in a given signature use their own private key shares as private inputs and the message to be signed as a public input, performing a joint signing operation to obtain a signature. In this process, the privacy of secure multi-party computation protects the private key shares, while correctness and robustness guarantee the unforgeability of the signature and that every participant obtains it.
  • Verification phase: the public key corresponding to the transaction verifies the signature exactly as in the traditional algorithm. There is no secret input during verification, so it can be performed without multi-party computation, which is an advantage of MPC-based distributed signatures.
The signature protocol built on the idea of secure multi-party computation is the threshold signature. Note that we have omitted some details, because secure multi-party computation is really a collective name for a family of cryptographic protocols. Different security assumptions and threshold settings lead to different constructions, so threshold signatures under different settings have distinctive properties. This article will not cover each setting, but a comparison with other signature schemes follows in the next section.
IV. Single Signature, Multi-Signature and Threshold Signature
Besides the threshold signature, what other methods can we choose?
Bitcoin at the beginning used single signatures, allocating each account one private key; a message signed by that key is considered legitimate. Later, in order to avoid single points of failure and to allow account management by multiple people, Bitcoin added a multi-signature function. Multi-signature can be understood simply as each account owner signing in turn and posting all signatures to the chain, where they are verified in order; when certain conditions are met, the transaction is legitimate. This achieves control by multiple private keys.
So, what’s the difference between multi-signature and threshold signature?
Several constraints of multi-signature are:
  1. The access structure is not flexible. Once an account's access structure is fixed (that is, which private keys can complete a legal signature), it cannot be adjusted later, for example when a participant withdraws or a new party joins. If you must change it, you need to repeat the initial setup, which changes the public key and the account address as well.
  2. Lower efficiency. First, on-chain verification consumes the power of all nodes and therefore requires a processing fee, and verifying multiple signatures costs as much as verifying that many single signatures. Second, performance: verification obviously takes more time.
  3. It requires smart contract support and algorithm adaptation that varies from chain to chain, because multi-signature is not natively supported everywhere. Given possible vulnerabilities in smart contracts, this support is considered risky.
  4. No anonymity. This is not trivially a disadvantage or an advantage, because anonymity is only required in specific conditions. Anonymity here means that multi-signature directly exposes all participating signers of a transaction.
Correspondingly, the threshold signature has the following features:
  1. The access structure is flexible. Through an additional multi-party computation, the existing private key shares can be extended to assign shares to new participants. This process exposes neither the old nor the newly generated shares, and changes neither the public key nor the account address.
  2. It is more efficient. On chain, a signature generated by the threshold scheme is indistinguishable from a single signature, which brings the following improvements: a) verification is the same as for a single signature and needs no additional fee; b) the signers' information is invisible, because other nodes verify with the same single public key; c) no on-chain smart contract is needed for additional support.
In addition to the above, there is a distributed signature scheme based on Shamir secret sharing. Secret sharing has a long history: it is used to slice information for storage and to perform error correction, from the underlying algorithms of secure computation to the error-correcting codes on optical discs. The main problem is that, when used in a signature protocol, Shamir secret sharing must briefly reconstruct the master private key.
With multi-signature or threshold signature, by contrast, the master private key is never reconstructed, not even in memory or cache; for high-value accounts, even a short-lived reconstruction is not tolerable.
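To make the distinction concrete, here is a minimal sketch of Shamir secret sharing itself: a secret becomes the constant term of a random degree t-1 polynomial, shares are points on that polynomial, and any t shares recover the secret by Lagrange interpolation. Note the drawback discussed above: `reconstruct` literally recreates the master secret in memory.

```python
# Shamir (t, n) secret sharing over a prime field, with Lagrange reconstruction.
import secrets

P = 2**61 - 1  # a Mersenne prime; all arithmetic is in GF(P)

def split(secret, t, n):
    """Hide `secret` as f(0) of a random degree t-1 polynomial; shares are (x, f(x))."""
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over any t shares -- rebuilds the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = split(123456789, t=3, n=5)
print(reconstruct(shares[:3]))   # 123456789
print(reconstruct(shares[2:5]))  # 123456789: any 3 of the 5 shares suffice
```

A threshold signature scheme keeps the same t-of-n access structure but runs the signing computation on the shares directly, so the line equivalent to `reconstruct` never executes; that is precisely the security advantage the text claims.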
V. Limitations
Like other secure multi-party computation protocols, the introduction of additional participants makes the security model different from traditional point-to-point encrypted transmission. Collusion and malicious participants were not taken into account in earlier algorithms: the behavior of physical entities cannot be restricted, and adversaries may be introduced into the participating group.
Therefore, multi-party cryptographic protocols cannot automatically achieve the same security strength as before. Effort is still needed to develop threshold signature applications, integrate them with existing infrastructure, and test the true strength of threshold signature schemes.
VI. Scenarios
1. Key Management
Using threshold signatures in a key management system allows more flexible administration, as in ARPA's enterprise key management API. One can use the access structure to design authorization patterns for users with different priorities. In addition, when new entities join, the threshold scheme can quickly refresh the key shares; this operation can also be performed periodically to raise the difficulty of compromising multiple private key shares at the same time. Finally, for the verifier, a threshold signature is no different from a traditional signature, so it is compatible with old equipment and reduces upgrade costs. ARPA's enterprise key management modules already support the Elliptic Curve Digital Signature Algorithm with secp256k1 and ed25519 parameters, and will be compatible with more parameters in the future.

2. Crypto Wallet
Wallets based on threshold signatures are more secure because the private key never needs to be rebuilt. Also, since not all signatures are posted publicly, anonymity can be achieved. Compared with multi-signature, a threshold signature incurs lower transaction fees. As with key management, the administration of digital asset accounts becomes more flexible. Furthermore, a threshold signature wallet can support blockchains that do not natively support multi-signature, which reduces the risk of smart contract bugs.

Conclusion

This article describes why people need threshold signatures and the inspiring properties they bring: higher security, more flexible control, and a more efficient verification process. Different signature technologies suit different application scenarios, such as the aggregate signatures and BLS-based multi-signatures not covered here. Readers are also welcome to read more about secure multi-party computation. Secure computation is the holy grail of cryptographic protocols; it can accomplish much more than threshold signatures, and in the near future it will solve more concrete application problems in the digital world.

About Author

Dr. Alex Su works for ARPA as the cryptography researcher. He got his Bachelor’s degree in Electronic Engineering and Ph.D. in Cryptography from Tsinghua University. Dr. Su’s research interests include multi-party computation and post-quantum cryptography implementation and acceleration.

About ARPA

ARPA is committed to providing secure data transfer solutions based on cryptographic operations for businesses and individuals.
The ARPA secure multi-party computing network can be used as a protocol layer to implement privacy computing capabilities for public chains, and it enables developers to build efficient, secure, and data-protected business applications on private smart contracts. Enterprise and personal data can, therefore, be analyzed securely on the ARPA computing network without fear of exposing the data to any third party.
ARPA’s multi-party computing technology supports secure data markets, precision marketing, credit score calculations, and even the safe realization of personal data.
ARPA’s core team is international, with PhDs in cryptography from Tsinghua University, experienced systems engineers from Google, Uber, Amazon, Huawei and Mitsubishi, blockchain experts from the University of Tokyo, AIG, and the World Bank. We also have hired data scientists from CircleUp, as well as financial and data professionals from Fosun and Fidelity Investments.
For more information about ARPA, or to join our team, please contact us at [email protected].
Learn about ARPA’s recent official news:
Telegram (English): https://t.me/arpa_community
Telegram (Việt Nam): https://t.me/ARPAVietnam
Telegram (Russian): https://t.me/arpa_community_ru
Telegram (Indonesian): https://t.me/Arpa_Indonesia
Telegram (Thai): https://t.me/Arpa_Thai
Telegram (Philippines):https://t.me/ARPA_Philippines
Telegram (Turkish): https://t.me/Arpa_Turkey
Korean Chats: https://open.kakao.com/o/giExbhmb (Kakao) & https://t.me/arpakoreanofficial (Telegram, new)
Medium: https://medium.com/@arpa
Twitter: u/arpaofficial
Reddit: https://www.reddit.com/arpachain/
Facebook: https://www.facebook.com/ARPA-317434982266680/54
submitted by arpaofficial to u/arpaofficial

Jiangzhuoer: CSW's Three Extreme Claims - [BitKan 1v1] Craig Wright vs Jiangzhuoer

Digest from [BitKan 1v1] debate.
bitkan.pro aggregates the full trading depth of Binance, Huobi, and OKEx. Or try our app!
Question 2: During the BCH fork to BSV hash war, why do you support BCH? What do you think of the differences between BSV and BCH?
Jiang: First of all, we have to figure out how some of BSV's key propositions came about. CSW seems to be the leader of the BSV community, but in fact CSW is just a chess piece. For example, CSW is nominally the chief scientist of nChain, but he holds no shares in the series of BSV-related companies such as nChain and CoinGeek. The true boss of BSV, and the main backer behind CSW, is the casino tycoon Calvin Ayre.
Zhao Nan wrote two articles that lay out the cause and effect of Calvin Ayre's capital layout:
"The capital layout of the casino tycoon Calvin Ayre" >>(Chinese)
"The ins and outs of the Calvin Ayre team" >>(Chinese)
Therefore, Calvin Ayre's ultimate goal is to make money on the Canadian stock market through CoinGeek. CoinGeek develops its own mining machines, mines itself, controls the BSV chain, and uses "CSW" as the gimmick to tell us the story of BSV.

The BCH fork that produced BSV was thus a step in Calvin Ayre's overall capital layout. It happened not because of any irreconcilable difference in development direction, but because CoinGeek needed to control BCH; if BCH could not be controlled, it would be split into a chain that CoinGeek could control completely. The whole thing was planned in advance: for example, bitcoinsv.org was registered on July 2, 2018, and bitcoinsv.io on August 16, long before CSW began firing shots at the ABC team.
CSW's goal was to split BSV from BCH, so he had to overstate many of his claims in order to create a split. Had he put forward reasonable claims, the rational and pragmatic BCH community would have accepted them and no split could have happened. He needed to make claims so extreme that the BCH community could not accept them, and then incite some community members with those extremist claims, just as the Nazis used extreme propaganda and incitement, in order to split from BCH.

CSW's extreme claims, such as:
1 Super blocks: BCH advocates scaling through larger blocks. What about CSW? He demanded oversized blocks on a very short timeline. The BCH 32MB block is sufficient and does not exceed the network load, yet CSW insisted he would upgrade to 128MB immediately rather than wait until the next year, and that he would upgrade to 2GB in 2019 as well.
But the result? Never mind 2GB, even a 100MB block exceeds the network's current carrying capacity. After the split, BSV blocks were sometimes too large to propagate across the entire network in time, and there have been several deep rollbacks. On April 18, 2019, the 128MB block at height 578640 resulted in a rollback of 6 confirmed blocks, making 6 confirmations unreliable.
On April 18, 2019, Beijing time, from 21:00 to 22:00, a deep reorganization of up to six blocks occurred on the BSV network (block heights 578640-578645).

https://preview.redd.it/7winlisnkoc31.png?width=1124&format=png&auto=webp&s=1c766e14d6360f869006b918b3e7d2a25b9b5fe4
According to BitMEX Research, the BSV chain was rolled back by two blocks that week. One of the orphaned blocks was about 62.6MB in size, and this large block may have caused the rollback. In addition, BSV plans to launch an upgrade called Quasar on July 24 whose only change is to raise the default block size limit. Larger blocks increase the probability of reorganization: while a large block is still propagating, several small blocks can overtake it in height, leading to a block reorganization or even a fork.

2 Locking the protocol: A chain needs a stable protocol; drastically changing it with every upgrade clearly hurts development built on top of it. But if CSW had simply proposed a stable protocol, everyone would have agreed, and he could not have split. So what did he do? CSW went even more extreme: "I am going to set the protocol and lock it," even rolling back to the original version of Bitcoin, which is ridiculous.
The environment changes, and the protocol must change with it. For example, if version 0.1 of Bitcoin were perfect and the 14-day difficulty adjustment were not a defect, why doesn't BSV remove BCH's "non-original" DAA difficulty adjustment algorithm and switch back to the 14-day adjustment? Because the moment BSV removed the BCH DAA, it would be cut down by large swings of hashpower.

3 Hashpower decides everything: Why did CSW advocate that hashpower should decide everything? Because the extremists did not dominate the community at the time, while CA's Coingeek had deployed a large number of mining machines and commanded a great deal of hashpower, so he pushed for hashpower to decide everything. Of course, he did not know that my hashpower exceeded his; I will talk about this later.
Because these claims were created for splitting rather than arising from natural development, they contradict one another. For example, CSW says the protocol is to be locked, but also that hashpower decides everything, even something as drastic as raising the 21 million total supply. So which one has the final say?

Why don't I support BSV's development path? Because these extreme claims of CSW's, whether giant blocks, locking the protocol, or hashpower deciding everything, were all proposed purposefully to engineer a split. In practice they cannot be implemented, so of course I will not support extreme claims that can never actually land.
Moreover, these extreme claims will become a heavy liability for BSV's future development. Developing strictly according to them is impossible, so they will have to be revised, and the community members who were incited by them will never accept that. How, then, can BSV keep developing?

submitted by BitKan to btc [link] [comments]

Constructing an Opt-In alternative reward for securing the blockchain

Since a keyboard with a monero logo got upvoted to the top I realized I should post various thoughts I have and generate some discussion. I hope others do the same.
Monero is currently secured by a dwindling block reward. There is a chance that the tail emission reward + transaction fees to secure the blockchain could become insufficient and allow for a scenario where it is profitable for someone to execute a 51% attack.
To understand this issue better, read this:
In Game Theory, Tragedy of the Commons is a market failure scenario where a common good is produced in lower quantities than the public desires, or consumed in greater quantities than desired. One example is pollution - it is in the public's best interest not to pollute, but every individual has incentive to pollute (e.g. because burning fossil fuel is cheap, and individually each consumer doesn't affect the environment much). The relevance to Bitcoin is a hypothetical market failure that might happen in the far future when the block reward from mining drops near zero. In the current Bitcoin design, the only fees miners earn at this time are Transaction fees. Miners will accept transactions with any fees (because the marginal cost of including them is minimal) and users will pay lower and lower fees (in the order of satoshis). It is possible that the honest miners will be under-incentivized, and that too few miners will mine, resulting in lower difficulty than what the public desires. This might mean various 51% attacks will happen frequently, and the Bitcoin will not function correctly. The Bitcoin protocol can be altered to combat this problem - one proposed solution is Dominant Assurance Contracts. Another more radical proposal (in the sense that the required change won't be accepted by most bitcoiners) is to have a perpetual reward that is constant in proportion to the monetary base. That can be achieved in two ways. An ever increasing reward (inflatacoin/expocoin) or a constant reward plus a demurrage fee in all funds that caps the monetary base (freicoin). 
This scenario was discussed in several threads:
- Tragedy of the Commons - Disturbingly low future difficulty equilibrium: https://bitcointalk.org/index.php?topic=6284.0
- Stack Exchange: http://bitcoin.stackexchange.com/questions/3111/will-bitcoin-suffer-from-a-mining-tragedy-of-the-commons-when-mining-fees-drop-t
Currently there is no consensus on whether this problem is real and, if so, what the best solution is.
Source: https://en.bitcoin.it/wiki/Tragedy_of_the_Commons
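The quoted concern can be made concrete with a toy model. Everything below is an illustrative assumption, not Monero's or Bitcoin's actual parameters: a 51% attack becomes rational roughly when the attacker's expected gain exceeds the cost of out-mining the honest network for the duration of the attack, and that cost scales with the per-block reward plus fees.

```python
def attack_is_profitable(expected_gain, reward_plus_fees_per_block,
                         blocks_needed, rental_discount=1.0):
    # Toy model: renting >50% of the hashpower for `blocks_needed` blocks
    # costs roughly what honest miners earn over that window, scaled by
    # how cheaply hashpower can be rented.
    attack_cost = reward_plus_fees_per_block * blocks_needed * rental_discount
    return expected_gain > attack_cost

# As the block reward dwindles, the same attack gets cheaper to justify:
print(attack_is_profitable(100.0, 5.0, 60))   # healthy reward: not profitable
print(attack_is_profitable(100.0, 0.5, 60))   # dwindled reward: profitable
```

The point of the sketch is only that security spending is a product of reward and time: halve the per-block income and you halve the budget an attacker must outspend.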

I suspect the least contentious solution is not to change the code or emission, or to artificially increase fees (which I believe would actually undermine the tail emission and lead to other problems: https://freedom-to-tinker.com/2016/10/21/bitcoin-is-unstable-without-the-block-reward/), but rather to use a Dominant Assurance Contract that makes it rational for those who benefit from Monero to contribute to the block reward.

Dominant assurance contracts
Dominant assurance contracts, created by Alex Tabarrok, involve an extra component, an entrepreneur who profits when the quorum is reached and pays the signors extra if it is not. If the quorum is not formed, the signors do not pay their share and indeed actively profit from having participated since they keep the money the entrepreneur paid them. Conversely, if the quorum succeeds, the entrepreneur is compensated for taking the risk of the quorum failing. Thus, a player will benefit whether or not the quorum succeeds; if it fails he reaps a monetary return, and if it succeeds, he pays only a small amount more than under an assurance contract, and the public good will be provided.
Tabarrok asserts that this creates a dominant strategy of participation for all players. Because all players will calculate that it is in their best interests to participate, the contract will succeed, and the entrepreneur will be rewarded. In a meta-game, this reward is an incentive for other entrepreneurs to enter the DAC market, driving down the cost disadvantage of dominant assurance contract versus regular assurance contracts.
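Tabarrok's payoff structure can be sketched as two tiny functions. The numbers in the example are hypothetical, chosen only to show that a participating signor ends up with a positive outcome whether or not the quorum forms:

```python
def contributor_payoff(quorum_met, good_value, share, bonus):
    # Net outcome for a participating contributor (signor).
    if quorum_met:
        return good_value - share   # pays the share, receives the public good
    return bonus                    # keeps the share AND collects the bonus

def entrepreneur_payoff(quorum_met, fee, bonus, n_contributors):
    # Net outcome for the entrepreneur who underwrites the contract.
    if quorum_met:
        return fee                      # compensated for bearing the risk
    return -bonus * n_contributors      # pays every signor the failure bonus

# Hypothetical numbers: the good is worth 10 to each signor, a share costs 8,
# and the failure bonus is 1.
print(contributor_payoff(True, 10, 8, 1))    # quorum forms: nets 2
print(contributor_payoff(False, 10, 8, 1))   # quorum fails: nets 1
```

Note the asymmetry: the entrepreneur is the only party exposed to the failure case, and that exposure is exactly what funds the signors' incentive to participate.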
Monero doesn't have a lot of scripting options to work with currently so it is very hard for me to understand how one might go about creating a Dominant Assurance Contract using Monero, especially in regards to paying out to a miner address.
This is how it could work in Bitcoin:
https://en.bitcoin.it/wiki/Dominant_Assurance_Contracts
This scheme is an attempt at Mike Hearn's exercise for the reader: an implementation of dominant assurance contracts. The scheme requires the use of multisignature transactions, nLockTime and transaction replacement which means it won't work until these features are available on the Bitcoin network.
A vendor agrees to produce a good if X BTC are raised by date D and to pay Y BTC to each of n contributors if X BTC are not raised by date D, or to pay nY BTC if X BTC are raised and the vendor fails to produce the good to the satisfaction of 2 of 3 independent arbitrators picked through a fair process
The arbitrators specify a 2-of-3 multisignature script to use as an output for the fundraiser with a public key from each arbitrator, which will allow them to judge the performance on actually producing the good
For each contributor:
The vendor and the contributor exchange public keys
They create a 2-of-2 multisignature output from those public keys
With no change, they create but do not sign a transaction with an input of X/n BTC from the contributor and an input of Y BTC from the vendor, with X/n+Y going to the output created in 3.2
The contributor creates a transaction where the output is X+nY to the address created in step 2 and the input is the output of the transaction in 3.3, signs it using SIGHASH_ALL | SIGHASH_ANYONECANPAY, with version = UINT_MAX and gives it to the vendor
The vendor creates a transaction of the entire balance of the transaction in 3.3 to the contributor with nLockTime of D and version < UINT_MAX, signs it and gives it to the contributor
The vendor and contributor then both sign the transaction in 3.3 and broadcast it to the network, making the transaction in 3.4 valid when enough contributors participate and the transaction in 3.5 valid when nLockTime expires
As date D nears, nLockTime comes close to expiration.
If enough (n) people contribute, all of the inputs from 3.4 can combine to make the output valid when signed by the vendor, creating a valid transaction sending that money to the arbitrators, which only agree to release the funds when the vendor produces a satisfactory output
If not enough people contribute, the vendor's time-locked transaction from 3.5 becomes valid once nLockTime D expires, returning the entire balance (the contributor's X/n plus the vendor's Y bond) to the contributor. (Note that there is a limit at which it can be more profitable for the vendor to make the remaining contributions himself as D approaches.)
Now the arbitrators have control of X (the payment from the contributors) + nY (the performance bond from the vendor) BTC and pay the vendor only when the vendor performs satisfactorily
Such contracts can be used for crowdfunding. Notable examples from Mike Hearn include:
Funding Internet radio stations which don't want to play ads: donations are the only viable revenue source as pay-for-streaming models allow undercutting by subscribers who relay the stream to their own subscribers
Automatically contributing to the human translation of web pages


Monero has these features:
  1. Multisig
  2. LockTime (but it is much different than BTC's)
  3. A possibility to do MoJoin (CoinJoin-like) transactions, even if less than optimally private. There is hope that MoJoin schemes will allow for better privacy in the future:
I have a draft writeup for a merged-input system called MoJoin that allows multiple parties to generate a single transaction. The goal is to complete the transaction merging with no trust in any party, but this introduces significant complexity and may not be possible with the known Bulletproofs multiparty computation scheme. My current version of MoJoin assumes partial trust in a dealer, who learns the mappings between input rings and outputs (but not true spends or Pedersen commitment data).

Additionally, Non-Interactive Refund Transactions could also be possible in Monero's future.
https://eprint.iacr.org/2019/595
I can't fully work out how all of these could combine to make a DAC that allows miners to put up and pay out a reward if it doesn't succeed, or how we could make it so *any* miner who participated (by putting up a reward) could claim the reward if it succeeded. I think this should really be explored, as it could make for a much more secure blockchain, potentially saving us if a "crypto winter" hits where the value of Monero and the number of transactions are low, making for a blockchain that is hard to trust because it would be so cheap to perform a 51% attack.


I am still skeptical of Dominant Assurance Contracts. Despite success in an initial test (https://marginalrevolution.com/marginalrevolution/2013/08/a-test-of-dominant-assurance-contracts.html), they still remain questionable, or at least confusing: https://forum.ethereum.org/discussion/747/im-not-understanding-why-dominant-assurance-contracts-are-so-special
submitted by Vespco to Monero [link] [comments]

A (hopefully mathematically neutral) comparison of Lightning network fees to Bitcoin Cash on-chain fees.

A side note before I begin
For context, earlier today, sherlocoin made a post on this sub asking if Lightning Network transactions are cheaper than on-chain BCH transactions. This user also went on to complain on /bitcoin that his "real" numbers were getting downvoted
I was initially going to respond to his post, but after I typed some of my response, I realized it is relevant to a wider Bitcoin audience and the level of analysis done warranted a new post. This wound up being the longest post I've ever written, so I hope you agree.
I've placed the TL;DR at the top and bottom for the simple reason that you need to prepare your face... because it's about to get hit with a formidable wall of text.
TL;DR: While Lightning node payments themselves cost less than on-chain BCH payments, the associated overhead currently requires a LN channel to produce 16 transactions just to break-even under ideal 1sat/byte circumstances and substantially more as the fee rate goes up.
Further, the Lightning network can provide no guarantee in its current state to maintain/reduce fees to 1sat/byte.

Let's Begin With An Ideal World
Lightning network fees themselves are indeed cheaper than Bitcoin Cash fees, but in order to get to a state where a Lightning network fee can be made, you are required to open a channel, and to get to a state where those funds are spendable, you must close that channel.
On the Bitcoin network, the minimum accepted fee is 1sat/byte so for now, we'll assume that ideal scenario of 1sat/byte. We'll also assume the open and close is sent as a simple native Segwit transaction with a weighted size of 141 bytes. Because we have to both open and close, this 141 byte fee will be incurred twice. The total fee for an ideal open/close transaction is 1.8¢
For comparison, a simple transaction on the BCH network requires 226 bytes one time. The minimum fee accepted next-block is 1sat/byte. At the time of writing an ideal BCH transaction fee costs ~ 0.11¢
This means that under idealized circumstances, you must currently make at least 16 transactions on a LN channel to break-even with fees
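The break-even arithmetic can be written out explicitly. The byte sizes and cent figures are the post's own point-in-time numbers, so treat the constants as assumptions:

```python
def ln_breakeven_txs(open_close_fee_cents, onchain_tx_fee_cents):
    # Payments a channel must route before the per-payment savings
    # (vs. simply paying on-chain each time) cover the open/close
    # overhead.  Assumes LN routing fees themselves are negligible.
    return open_close_fee_cents / onchain_tx_fee_cents

# Ideal case from above: 2 * 141 bytes at 1 sat/byte ~= 1.8 cents of BTC,
# vs. a 226-byte BCH transaction at 1 sat/byte ~= 0.11 cents of BCH.
print(round(ln_breakeven_txs(1.8, 0.11)))   # ~16 transactions
```

The same formula re-prices every scenario below: change either fee constant and the required transaction count moves proportionally.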
Compounding Factors
Our world is not ideal, so below I've listed compounding factors, common arguments, an assessment, and whether the problem is solvable.
Problem 1: Bitcoin and Bitcoin Cash prices are asymmetrical.
Common arguments:
BTC: If Bitcoin Cash had the same price, the fees would be far higher
Yes, this is true. If Bitcoin Cash had the same market price as Bitcoin, our ideal scenario changes substantially. An open and close on Bitcoin still costs 1.8¢ while a simple Bitcoin Cash transaction now costs 1.4¢. The break-even point for a Lightning Channel is now only 2 transactions.
Is this problem solvable?
Absolutely.
Bitcoin Cash has already proposed a reduction in fees to 1sat for every 10 bytes, and that amount can be made lower by later proposals. While there is no substantial pressure to implement this now, if Bitcoin Cash had the same usage as Bitcoin currently does, it is far more likely to be implemented. If implemented at the first proposed reduction rate, under ideal circumstances, a Lightning Channel would need to produce around 13 transactions for the new break even.
But couldn't Bitcoin reduce fees similarly
The answer there is really tricky. If you reduce on-chain fees, you reduce the incentive to use the Lightning Network, because the chain becomes more hospitable to micropayments. This would likely increase the typical mempool state and somewhat decrease the Lightning channel count. The upside is that when the mempool saturates with low-fee transactions, users are re-incentivized to use the Lightning Network once the lowest fee tiers are full. This should, in theory, produce some level of a transaction-fee floor, which is probably higher on average than 0.1 sat/byte on the BTC network.
Problem 2: This isn't an ideal world, we can't assume 1sat/byte fees
Common arguments:
BCH: If you tried to open a channel at peak fees, you could pay $50 each way
BTC: LN wasn't implemented which is why the fees are low now
Both sides have points here. It's true that if the mempool were in the same state as in December 2017, a user could have been incentivized to pay an open and close channel fee of up to 1000 sat/byte to be accepted in a reasonable time-frame.
With that being said, two factors have resulted in a reduced mempool size of Bitcoin: Increased Segwit and Lightning Network Usage, and an overall cooling of the market.
I'm not going to speculate as to what percentage of which is due to each factor. Instead, I'm going to simply analyze mempool statistics for the last few months where both factors are present.
Let's get an idea of current typical Bitcoin network usage fees by asking Johoe quick what the mempool looks like.
For the last few months, the bitcoin mempool has followed almost the exact same pattern. Highest usage happens between 10AM and 3PM EST with a peak around noon. Weekly, usage usually peaks on Tuesday or Wednesday with enough activity to fill blocks with at least minimum fee transactions M-F during the noted hours and usually just shy of block-filling capacity on Sat and Sun.
These observations can be additionally evidenced by transaction counts on bitinfocharts. It's also easier to visualize on bitinfocharts over a longer time-frame.
Opening a channel
Under pre-planned circumstances, you can offload channel creation to off-peak hours and maintain a 1sat/byte rate. The primary issue arises in situations where either 1) LN payments are accepted and you had little prior knowledge, or 2) You had a previous LN pathway to a known payment processor and one or more previously known intermediaries are offline or otherwise unresponsive causing the payment to fail.
Your options are:
A) Create a new LN channel on-the-spot where you're likely to incur current peak fee rates of 5-20sat/byte.
B) Create an on-chain payment this time and open a LN channel when fees are more reasonable.
C) Use an alternate currency for the transaction.
There is a fundamental divide over option C. Some people view Bitcoin primarily as a store of value, and thus believe that as long as some onramps and offramps exist, the currency will hold value. Others believe that fungibility is what gives a cryptocurrency its value, and that option C would fundamentally undermine the value of the currency.
I don't mean to dismiss either argument, but option C opens a can of worms that alone can fill economic textbooks. For the sake of simplicity, we will throw out option C as a possibility and save that debate for another day. We will simply require that payment is made in crypto.
With option B, you would absolutely need to pay the peak rate (likely higher) for a single transaction as a Point-of-Sale scenario with a full mempool would likely require at least one confirm and both parties would want that as soon as possible after payment. It would not be unlikely to pay 20-40 sat/byte on a single transaction and then pay 1sat/byte for an open and close to enable LN payments later. Even in the low end, the total cost is 20¢ for on-chain + open + close.
With present-day-statistics, your LN would have to do 182 transactions to make up for the one peak on-chain transaction you were forced to do.
With option A, you still require one confirm. Let's also give the additional leeway that in this scenario you have time to sit and wait a couple of blocks for your confirm before you order / pay. You can thus pay peak rates alone and not peak + ensure next block rates. This will most likely be in the 5-20 sat/byte range. With 5sat/byte open and 1sat/byte close, your LN would have to do 50 transactions to break even
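Using the same accounting as the ideal case, the two peak-fee options work out as follows (the cent figures are the post's point-in-time assumptions):

```python
def breakeven(overhead_cents, per_tx_savings_cents):
    # Payments needed before the LN overhead pays for itself.
    return overhead_cents / per_tx_savings_cents

# Option B: ~20 cents for the peak on-chain payment plus open/close,
# measured against 0.11-cent on-chain payments.
print(round(breakeven(20.0, 0.11)))           # ~182 transactions

# Option A: open at 5 sat/byte (5 * 0.9 cents) plus a 1 sat/byte close.
print(round(breakeven(5 * 0.9 + 0.9, 0.11)))  # ~49, i.e. roughly 50
```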
On close, fees are incurred by the channel's funding party, so there could be scenarios where the receiving party is incentivized to close in order to spend outputs while the software automatically calculates fees based on current rates. If this is the case, the receiving party could saddle the funding party with a higher-than-planned fee.
With that being said, any software that allows the funding party to set the fee beforehand would avoid unplanned fees, so we'll assume low fees for closing.
Is this problem solvable?
It depends.
In order to avoid the peak-fee open/close ratio problem, the Bitcoin network either needs to have much higher LN / Segwit utilization, or increase on-chain capacity. If it gets to a point where transactions stack up, users will be required to pay more than 1sat/byte per transaction and should expect as much.
Current Bitcoin network utilization is close enough to 100% to fill blocks during peak times. I also did an export of the data available at Blockchair.com for the last 3000 blocks which is approximately the last 3 weeks of data. According to their block-weight statistics, The average Bitcoin block is 65.95% full. This means that on-chain, Bitcoin can only increase in transaction volume by around 50% and all other scaling must happen via increased Segwit and LN use.
Problem 3: You don't fully control your LN channel states.
Common arguments:
BCH: You can get into a scenario where you don't have output capacity and need to open a new channel.
BCH: A hostile actor can cause you to lose funds during a high-fee situation where a close is forced.
BTC: You can easily re-load your channel by pushing outbound to inbound.
BCH: You can't control whether nodes you connect to are online or offline.
There's a lot to digest here, but LN is essentially a 2-way contract between 2 parties. Not only does the drafting party pay the fees as of right now, but connected third parties can affect the state of this contract. Some interesting scenarios develop because of it, and you aren't always in full control of either side.
Lack of outbound capacity
First, it's true that if you run out of outbound capacity, you either need to reload or create a new channel. This could potentially require 0, 1, or 2 additional on-chain transactions.
If a network loop exists between a low-outbound-capacity channel and yourself, you could push transactional capacity through the loop back to the output you wish to spend to. This would require 0 on-chain transactions and would only cost 1 (relatively negligible) LN fee charge. For all intents and purposes... this is actually kind of a cool scenario.
If no network loop exists from you-to-you, things get more complex. I've seen proposals like using Bitrefill to push capacity back to your node. In order to do this, you would have an account with them and they would lend custodial support based on your account. While people opting for trustless money would take issue in 3rd party custodians, I don't think this alone is a horrible solution to the LN outbound capacity problem... Although it depends on the fee that bitrefill charges to maintain an account and account charges could negate the effectiveness of using the LN. Still, we will assume this is a 0 on-chain scenario and would only cost 1 LN fee which remains relatively negligible.
If no network loop exists from you and you don't have a refill service set up, you'll need at least one on-chain payment to another LN entity in exchange for them to push LN capacity to you. Let's assume ideal fee rates. If this is the case, your refill would require an additional 7 transactions for that channel's new break-even. Multiply that by number of sat/byte if you have to pay more.
Opening a new channel is the last possibility and we go back to the dynamics of 13 transactions per LN channel in the ideal scenario.
Hostile actors
There are some potential attack vectors previously proposed. Most of these are theoretical and/or require high fee scenarios to come about. I think that everyone should be wary of them, however I'm going to ignore most of them again for the sake of succinctness.
This is not to be dismissive... it's just because my post length has already bored most casual readers half to death and I don't want to be responsible for finishing the job.
Pushing outbound to inbound
While I've discussed scenarios for this push above, there are some strange scenarios that arise where pushing outbound to inbound is not possible and even some scenarios where a 3rd party drains your outbound capacity before you can spend it.
A while back I did a testnet simulation to prove that this scenario can and will happen. It was a reply posted 2 weeks after the initial post, so it flew heavily under the radar, but the proof is there.
The moral of this story is in some scenarios, you can't count on loaded network capacity to be there by the time you want to spend it.
Online vs Offline Nodes
We can't even be sure that a given computer is online to sign a channel open or push capacity until we try. Offline nodes provide a brick-wall in the pathfinding algorithm so an alternate route must be found. If we have enough channel connectivity to be statistically sure we can route around this issue, we're in good shape. If not, we're going to have issues.
Is this problem solvable?
Only if the Lightning network can provide an (effectively) infinite amount of capacity... but...
Problem 4: Lightning Network is not infinite.
Common arguments:
BTC: Lightning network can scale infinitely so there's no problem.
Unfortunately, LN is not infinitely scalable. In fact, finding a pathway from one node to another is roughly the same problem as the traveling salesman problem, attacked via Dijkstra's algorithm, and its cost diverges polynomially; the most efficient proposals have a difficulty bounded by O(n^2).
Note - in the above I confused the complexity of the traveling salesman problem with Dijkstra when they do not have the same bound. With that being said, the complexity of the LN will still diverge with size
In lay terms, what that means is every time you double the size of the Lightning Network, finding an indirect LN pathway becomes 4 times as difficult and data intensive. This means that for every doubling, the amount of traffic resulting from a single request also quadruples.
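The doubling/quadrupling behavior is simply how any quadratic cost function behaves; the O(n^2) model itself is this post's assumption about LN pathfinding:

```python
def pathfinding_cost(n_channels):
    # Assumed O(n^2) cost model for finding an indirect LN route.
    return n_channels ** 2

base = pathfinding_cost(1_000)
for n in (1_000, 2_000, 4_000):
    # Each doubling of network size multiplies the work by 4.
    print(n, pathfinding_cost(n) / base)   # 1.0, then 4.0, then 16.0
```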
You can potentially temporarily mitigate traffic by bounding the number of hops taken, but that would encourage a greater channel-per-user ratio.
For a famous example... the game "6 Degrees of Kevin Bacon" postulates that Kevin Bacon can be connected by co-stars to any movie within 6 degrees of separation. If the game were reduced to "4 Degrees of Kevin Bacon," users of this network would still want as many connections as possible, so they'd be incentivized to hire Kevin Bacon to star in everything. You'd start to see ridiculous mash-ups and reboots just to get more connectivity... Just imagine hearing: Coming soon - Kevin Bacon and Adam Sandler star in "Billy Madison 2: Replace the Face."
Is this problem solvable?
Signs point to no.
So technically, if average computational power and network connectivity can handle a problem of size (the number of Lightning Network channels needed to connect the world)^2 in a trivial amount of time, the Lightning Network is effectively infinite, as the upper bound of a non-infinite earth would limit time-frames to those that are computationally feasible.
With that being said, BTC has discussed Lightning dev comments before that estimated a cap of 10,000 - 1,000,000 channels before problems are encountered which is far less than the required "number of channels needed to connect the world" level.
In fact, SHA256 is a newer hard problem than the traveling salesman problem. That means that, statistically and based on the amount of review each problem has received, it is more likely that SHA256, the algorithm that lends security to all of Bitcoin, is cracked before the traveling salesman problem is. Notions that "a dedicated dev team can suddenly solve this problem", while not technically impossible, border on the statistically absurd.
Edit - While the case isn't quite as bad as the traveling salesman problem, the problem will still diverge with size and finding a more efficient algorithm is nearly as unlikely.
This upper bound shows that we cannot count on infinite scalability or connectivity for the Lightning Network. Thus there will always be on-chain fee pressure, and it will rise as the LN approaches its computational upper bound.
Because you can't count on channel states, the on-chain fee pressure will cause typical sat/byte fees to raise. The higher this rate, the more transactions you have to make for a Lightning payment open/close operation to pay for itself.
This is, of course unless it is substantially reworked or substituted for a O(log(n))-or-better solution.
Finally, I'd like to add: creating an on-chain transaction is a fixed, non-recursive, non-looping procedure, effectively O(1); sending the transaction over a peer-to-peer network is bounded by O(log(n)); and accepting payment is, again, O(1). This means that (as far as I can tell) on-chain transactions very likely scale more effectively than the Lightning Network in its current state.
Additional notes:
My computational difficulty assumptions were based on a generalized, but similar problem set for both LN and on-chain instances. I may have overlooked additional steps needed for the specific implementation, and I may have overlooked reasons a problem is a simplified version requiring reduced computational difficulty.
I would appreciate review and comment on my assumptions for computational difficulty and will happily correct said assumptions if reasonable evidence is given that a problem doesn't adhere to listed computational difficulty.
TL;DR: While Lightning node payments themselves cost less than on-chain BCH payments, the associated overhead currently requires a LN channel to produce 16 transactions just to break-even under ideal 1sat/byte circumstances and substantially more as the fee rate goes up.
Further, the Lightning network can provide no guarantee in its current state to maintain/reduce fees to 1sat/byte.
submitted by CaptainPatent to btc [link] [comments]

Intersecting and competing interests of miners vs. investors

This post is pure speculation, but it's something I've been thinking about for a while. It's informational rather than a quick FUD/FOMO analysis; that said, I do make a case for being a long-term bull (i.e. over years).
There are two major groups with large individual resources: miners and crypto investors. These aren't your average traders; these are large, multi-million-dollar groups (or larger). Let's look at the motivations of both to see how they can relate to prices.
Crypto Miners
Miners obviously want maximum profit. There are several ways to do this:
Note that Bitcoin's difficulty is at an all-time high, and Litecoin's is too. Increased difficulty means the same equipment will take longer to generate the same reward. Also note that with the upcoming halving - coming in a month for Litecoin and next year for Bitcoin - the block reward for each crypto will be cut in half. This means that, all else being equal, miners' profit will drop significantly (temporarily, at least).
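The halving schedule referred to above is mechanical and easy to sketch (Bitcoin parameters shown; Litecoin uses the same 50-coin initial reward with an 840,000-block interval):

```python
def block_reward(height: int, initial_reward: float = 50.0,
                 halving_interval: int = 210_000) -> float:
    """Coinbase reward at a given block height: the initial reward is
    cut in half every `halving_interval` blocks."""
    return initial_reward / (2 ** (height // halving_interval))

print(block_reward(0))        # 50.0
print(block_reward(420_000))  # 12.5 - the reward at the time of writing
print(block_reward(630_000))  # 6.25 - after the halving the post anticipates
```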
The other news is that your typical miner isn't making a lot of money. As in many other industries, economies of scale come into play, and the big operators with large facilities and equipment are the ones making more money. This means more power in the hands of fewer people, each with a larger investment and their own interests. How is an individual supposed to compete with that?
Also note that when the crypto market fell at the end of 2017, mining-hardware manufacturers took losses due to a lack of new buyers. This led to a collapse in prices for various ASIC equipment and related hardware, and it does affect stock market prices. Although crypto hardware isn't exactly a huge profit center, check out the stock prices of AMD, Intel, and NVidia for the last 5 years. You'll see articles like this and this that support my conclusions. Someone could dig more into this to get better numbers.
Crypto Investors
Crypto investors (the whales) don't really care as much about buying vs. selling - they can profit from a move in either direction. However, shorting is risky, and shorting crypto is very risky, so most are likely to err on the side of growth. Large swings in prices also benefit them more than steady growth does. They want the market to continue to grow, since if it shrinks it can be destroyed and their profits will go away. They also don't want the market to get too large too fast, but some things are beyond their control once markets overheat. They're frustrated because they want to pump a lot of money into this - for massive profits - but that much attention would be noticed. For instance, if some whale invested $50b into Bitcoin, it would cause havoc in the market and in prices, so they have to keep their investments relatively small. The big institutions want to throw more money into it, but they know that if they do, the market will get out of hand. Being noticed invites unwanted regulation, and that leads to loss of control and, likely, lower prices with less opportunity.
Note that the interests of miners and investors sometimes overlap. For instance, miners want the crypto price to be higher so they earn higher profits, and investors likewise benefit from higher prices.
However, sometimes their interests conflict. For instance, if I were running a mining business with some resources, here's what I'd see: rising costs due to higher ASIC prices, a lower reward due to higher difficulty, and a lower reward still after the halving.
What's my solution? I would:
You can see how investors could work toward this, and how some miners could pool their money to hire professional traders to do the same. The same goes for companies like AMD, Intel, NVidia, and others (ex: Samsung) who stand to make a lot of money selling this equipment.
The simple problem with crypto is that for it to succeed:
The only solution is for the miners - and their suppliers - to continue to push crypto prices higher to maximize their profits... indefinitely. Investors help with raising prices, but they also help when the market overheats and they cash out and/or short. A market crash temporarily helps miners, who can then buy cheaper equipment.
We've all seen charts like these. How else can you explain such projections (given past history)? You do it with the continued - almost mathematically calculated - rises and falls in prices over time. If you add in difficulty, ASIC prices, and miner profitability, I'm sure you'll see a pattern. Greater difficulty (i.e. more costs) and higher hardware prices require higher crypto prices for miners to stay in business. Considering the market is still relatively small, it's easier to manipulate it toward higher prices.
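The "higher difficulty means a higher breakeven price" relationship can be made concrete with a back-of-the-envelope profitability model; every number below is an illustrative assumption, not a measured figure:

```python
def daily_profit_usd(hashrate_ths: float, network_ths: float,
                     block_reward: float, price_usd: float,
                     power_kw: float, elec_usd_per_kwh: float) -> float:
    """Expected miner profit per day: a proportional share of the
    ~144 daily block rewards, minus electricity cost."""
    revenue = (hashrate_ths / network_ths) * 144 * block_reward * price_usd
    cost = power_kw * 24 * elec_usd_per_kwh
    return revenue - cost

# One modern rig against an assumed 90 EH/s network at $8,000/BTC
base = daily_profit_usd(73, 90e6, 12.5, 8000, 2.9, 0.06)
# Network hash rate (difficulty) doubling squeezes the same rig's margin
squeezed = daily_profit_usd(73, 180e6, 12.5, 8000, 2.9, 0.06)
print(round(base, 2), round(squeezed, 2))
```

Halve the reward or double the difficulty and the same rig needs a substantially higher coin price to stay above its electricity bill, which is the pattern the post describes.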
submitted by SsurebreC to LitecoinTraders

Research on ChainLink / The future might be bright

I'm a bit skeptical about this project, mainly because, after reading the white paper, I can't find real-world usage of ChainLink and its tech at the moment. I'd be grateful if you, my dear reader, would like to discuss it. Before I review the project, I want to clarify why I think the future might be bright while the present has numerous problems that must be addressed.
The main problem is technology adoption.
The technology adoption life cycle is a sociological model that describes the adoption or acceptance of a new product or innovation. In our case, cryptocurrency is the new product, and I believe we are currently at the stage between Innovators and Early Adopters. My point is that a lot of people have still never heard of cryptocurrencies, or even of Bitcoin, and have no idea why they would need tokens or digital currencies. That lack of education keeps the popularization of cryptocurrencies slow, and closing it takes time. I'm not talking about Venezuela - that's a different story, where people are terrified of the government and of inflation - I'm talking about stable countries.
In addition, at the early-adoption stage, strategy is a critical part of any startup's business model, and when customers must pay for a new product with new money (i.e. cryptocurrencies), a number of issues arise that demand a clear understanding.

The second problem is the high volatility and low liquidity of cryptocurrencies.
This problem affects almost the whole sector. Of course, there are plenty of big and small online exchanges, but all of them require a KYC verification procedure, which is a potential barrier to using cryptocurrencies. How many potential clients can a startup lose if it accepts only cryptocurrencies as a payment method? Only if crypto payments were one option among several, with a solid discount to cover customers' exchange fees, would a startup benefit greatly through growth of its client base.
For business, in the context of high cryptocurrency volatility, using tokens can create problems for companies. For instance, suppose I offer a data feed to customers. First I think about the pricing structure for users: if the token price rises by 10%, I must immediately push the token-denominated price down to keep the USD/EUR/fiat price stable, and afterwards I need to exchange the received tokens for USD, EUR, or my national currency, because I pay taxes and my government doesn't accept cryptocurrencies. Even if this process were automatic, I would still pay fees at every stage, and every fee is my potential profit.
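That "every fee is my potential profit" point compounds, because a chain of percentage fees multiplies. A minimal sketch, with assumed fee rates for each stage:

```python
def net_after_fees(gross: float, fees: list) -> float:
    """Proceeds left after a chain of percentage fees
    (trade fee, withdrawal fee, fiat off-ramp, ...)."""
    for f in fees:
        gross *= (1.0 - f)
    return gross

# Assumed rates: 0.2% trade, 0.5% withdrawal, 1% fiat conversion
print(round(net_after_fees(1000.0, [0.002, 0.005, 0.01]), 2))  # 983.08
```

Even with these modest assumed rates, roughly 1.7% of revenue disappears before it ever reaches the business's bank account.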
White Paper.
The white paper is well written and requires minimal technical skill to understand what to expect from this new vision of smart contracts. So this time we're being asked to assess a future in which ChainLink provides on-chain and off-chain algorithms for B2B, B2C, and C2C use, based on a decentralized blockchain.
The on-chain algorithm.
"The ChainLink system proposes the use of a simple protocol involving threshold signatures. Such signatures can be realized using any of a number of signature schemes, but are especially simple to implement using Schnorr signature."
This algorithm is secure enough, and the steps behind it are rational. It means the confidentiality of a request to a trustworthy data source is protected and cannot be used or revealed by a compromised node.
"in order to decrypt an encrypted message or to sign a message, several parties (more than some threshold number) must cooperate in the decryption or signature protocol. The message is encrypted using a public key and the corresponding private key is shared among the participating parties." Wikipedia.
Furthermore, if you worry that a single data source could be compromised by unauthorized persons, you can obtain data from several sources, which also effectively prevents incorrect answers. But the off-chain algorithm is more interesting.
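The threshold idea in the quoted passage - a secret usable only when enough parties cooperate - is not specific to Schnorr signatures. A toy illustration using Shamir secret sharing over a prime field (this is an illustrative sketch, not ChainLink's actual scheme):

```python
import random

PRIME = 2**127 - 1  # field modulus for the toy scheme

def make_shares(secret: int, threshold: int, n: int):
    """Split `secret` into n shares; any `threshold` of them recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x=0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(12345, threshold=3, n=5)
print(recover(shares[:3]))  # 12345 - any 3 of the 5 shares suffice
```

Fewer than `threshold` shares reveal nothing about the secret, which is the property that protects a request even when individual nodes are compromised.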
The off-chain algorithm.
This is a very interesting technology which, I think, could even be used by governments for secure data transmission over the internet. I recommend reading about how Intel SGX works, then the official Town Crier site, and then re-reading the white paper. As the main page of Town Crier puts it: "The Town Crier system leverages trusted hardware (Intel SGX) to provide a strong guarantee that data comes from an existing, trustworthy source. It also provides confidentiality, enabling smart contracts to support confidential queries and even manage user credentials." Those three sources will give you more helpful information than I can write here. There is nothing more I can say, because this is ChainLink's long-term technical strategy and it is still in development.
I almost agree to pay for a request - or: where is my profit?
So let's take some examples.
Shipping. My business model offers cheap parcel shipping, and you can also track a parcel's geographical coordinates, much as you would track your Uber ride. No one else on the market offers this option, so I have a market where I can dictate the rules. Then I notice that with the ChainLink platform my business model could significantly reduce shipping prices if my customers paid for each coordinate request. Wow! My company would even receive extra fees from my customers (if I understand the white paper correctly, when a user pays the request fee it is divided into several parts: my service receives one part and another goes to the node or pool). But back to the customers: if my app uses the ChainLink platform, my users must pay via the $LINK token to find out where their parcels are.
And here we return to the major problem, technology adoption. I know that my potential customer prefers to make an online payment via Visa or Mastercard rather than with cryptocurrencies. As a result, my potential customers will encounter difficulties: where to buy Bitcoin or Ethereum, how to exchange BTC or ETH for the $LINK token (problem number two), and other small obstacles - not to mention exchange fees. So whose service do you think they will choose: DHL, FedEx, UPS, or my company? Okay, you can say that one could use a Bitcoin ATM and then transfer the cryptocurrency to the Enjin Wallet, where it is easy to exchange Bitcoin or Ethereum for any ERC20 token, including $LINK. BUT a lot of people don't know about that, and this is a problem. Besides, this method incurs fees of its own.
Market data. For example, suppose my company has exclusive rights to real-time stock market data, which I sell in my app at a price lower than the competition's. That's good for my business, BUT my clients want to obtain data instantaneously, and here I can't offer them fast answers because my app's API works on the ChainLink platform. You can create and test your own smart contract here to measure the time from request to answer.
Every startup that wants to build on the ChainLink platform will face all these issues. Even if a startup can offer something new, if we as users must pay for the service through the $LINK token, that startup will lose a significant proportion of its customers - or it must offer data so exclusive that it creates significant value for its client base anyway.
In conclusion.
I don't want to blame the ChainLink platform. I want to say that this project offers us interesting possibilities, but they depend on today's circumstances. In my view, once I can buy cryptocurrencies as easily as ice cream, everything will come down to what I can offer you in exchange for your $LINK tokens.
And finally, I'd like to clarify that all the problems above are universal, but the other recent projects I reviewed have one major bonus: a working platform or site with loyal customers. The ChainLink platform has a good idea too, but its area of usage lacks the mainstream user, which leads to slow technology adoption and higher levels of frustration for investors.
PS.
This is not financial advice.
I have no investments in $LINK.
I continue to follow /ChainLink.
I apologize for any errors; English is not my native language.
submitted by Fanfan_la_Tulip to SAFU

What is the Difficulty Target? Explaining Bitcoin Target Difficulty
3 Primary Bitcoin Participants Affecting Price
Crypto Mining Difficulty 101 - Everything You Need to Know
Bitcoin Difficulty Explained
Some Known Facts About Bitcoin Return Calculator - Investment on Any Date and Inflation

To see real-world calculations of how difficulty affects the coins discovered per unit of time, open any mining profitability calculator and change the "difficulty" figure; the Bitcoin wiki also has details on difficulty. To conclude, a Bitcoin mining calculator can give you a much better idea of your potential to run a profitable mining operation. Remember, however, that some factors, such as Bitcoin's price and mining difficulty, change every day and can have dramatic effects on profitability, so it's important to run up-to-date calculations when needed. On average, Bitcoin blocks are mined every 10 minutes. A higher difficulty level implies that more hash power has joined the network, which in turn means existing miners control a lower percentage of the network hash power. Network difficulty and hash rate are external factors that miners must account for; miners cannot determine difficulty in advance. Since the price floor set by the difficulty adjustment ties breakeven cost and price together, the breakeven cost trend is a reasonable predictor of the future price of Bitcoin. Coin Metrics points out that the recent 16% decline in mining difficulty - largely caused by the sharp Bitcoin price decline in mid-March - is a sign that inefficient miners are


What is the Difficulty Target? Explaining Bitcoin Target Difficulty

Some Known Facts About Bitcoin Return Calculator - Investment on Any Date and Inflation: The difficulty of mining goes up by a factor of about 14% every thirty days, so ever more powerful hardware is required. To calculate the new difficulty, take the old difficulty times 20160 minutes (the number of minutes in two weeks) divided by the time in minutes taken by the last 2016 blocks. Bitcoin difficulty ribbon: Willy Woo's Bitcoin difficulty ribbon suggests a drop below $6,000/BTC is "very unlikely". In this clip Matt D'Souza talks about the 3 primary participants in the Bitcoin network and how they affect price; he goes into detail about Bitcoin miners and mining difficulty.
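The retarget rule quoted in the transcript can be written down directly. This sketch uses the difficulty form of the rule (Bitcoin also clamps the adjustment to a factor of four in either direction):

```python
def retarget_difficulty(old_difficulty: float, actual_minutes: float) -> float:
    """New difficulty after a 2016-block period: scale the old difficulty
    by expected time (20160 minutes) over the actual time taken,
    clamped to a 4x change either way as Bitcoin's rules require."""
    expected_minutes = 2016 * 10  # two weeks
    ratio = expected_minutes / actual_minutes
    return old_difficulty * max(0.25, min(4.0, ratio))

# Blocks averaging 9 minutes instead of 10 -> difficulty rises ~11%
print(retarget_difficulty(1.0, 2016 * 9))
```

If hash power floods in and blocks arrive faster than every 10 minutes, difficulty rises at the next retarget; if miners leave, it falls - exactly the feedback loop the snippets above describe.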
