Locky Ransomware - Encrypts Documents, Databases, Code, BitCoin Wallets and More... | Forcepoint / Domain Generation Code submitted by p33 to Malware [link] [comments]

Proposal: The Sia Foundation

Vision Statement

A common sentiment is brewing online: a shared desire for the internet that might have been. After decades of corporate encroachment, you don't need to be a power user to realize that something has gone very wrong.
In the early days of the internet, the future was bright. In that future, when you sent an instant message, it traveled directly to the recipient. When you needed to pay a friend, you announced a transfer of value to their public key. When an app was missing a feature you wanted, you opened up the source code and implemented it. When you took a picture on your phone, it was immediately encrypted and backed up to storage that you controlled. In that future, people would laugh at the idea of having to authenticate themselves to some corporation before doing these things.
What did we get instead? Rather than a network of human-sized communities, we have a handful of enormous commons, each controlled by a faceless corporate entity. Hey user, want to send a message? You can, but we'll store a copy of it indefinitely, unencrypted, for our preference-learning algorithms to pore over; how else could we slap targeted ads on every piece of content you see? Want to pay a friend? You can—in our Monopoly money. Want a new feature? Submit a request to our Support Center and we'll totally maybe think about it. Want to backup a photo? You can—inside our walled garden, which only we (and the NSA, of course) can access. Just be careful what you share, because merely locking you out of your account and deleting all your data is far from the worst thing we could do.
You rationalize this: "MEGACORP would never do such a thing; it would be bad for business." But we all know, at some level, that this state of affairs, this inversion of power, is not merely "unfortunate" or "suboptimal" – No. It is degrading. Even if MEGACORP were purely benevolent, it is degrading that we must ask its permission to talk to our friends; that we must rely on it to safeguard our treasured memories; that our digital lives are completely beholden to those who seek only to extract value from us.
At the root of this issue is the centralization of data. MEGACORP can surveil you—because your emails and video chats flow through their servers. And MEGACORP can control you—because they hold your data hostage. But centralization is a solution to a technical problem: How can we make the user's data accessible from anywhere in the world, on any device? For a long time, no alternative solution to this problem was forthcoming.
Today, thanks to a confluence of established techniques and recent innovations, we have solved the accessibility problem without resorting to centralization. Hashing, encryption, and erasure encoding got us most of the way, but one barrier remained: incentives. How do you incentivize an anonymous stranger to store your data? Earlier protocols like BitTorrent worked around this limitation by relying on altruism, tit-for-tat requirements, or "points" – in other words, nothing you could pay your electric bill with. Finally, in 2009, a solution appeared: Bitcoin. Not long after, Sia was born.
Cryptography has unleashed the latent power of the internet by enabling interactions between mutually-distrustful parties. Sia harnesses this power to turn the cloud storage market into a proper marketplace, where buyers and sellers can transact directly, with no intermediaries, anywhere in the world. No more silos or walled gardens: your data is encrypted, so it can't be spied on, and it's stored on many servers, so no single entity can hold it hostage. Thanks to projects like Sia, the internet is being re-decentralized.
Sia began its life as a startup, which means it has always been subjected to two competing forces: the ideals of its founders, and the profit motive inherent to all businesses. Its founders have taken great pains to never compromise on the former, but this often threatened the company's financial viability. With the establishment of the Sia Foundation, this tension is resolved. The Foundation, freed of the obligation to generate profit, is a pure embodiment of the ideals from which Sia originally sprung.
The goals and responsibilities of the Foundation are numerous: to maintain core Sia protocols and consensus code; to support developers building on top of Sia and its protocols; to promote Sia and facilitate partnerships in other spheres and communities; to ensure that users can easily acquire and safely store siacoins; to develop network scalability solutions; to implement hardforks and lead the community through them; and much more. In a broader sense, its mission is to commoditize data storage, making it cheap, ubiquitous, and accessible to all, without compromising privacy or performance.
Sia is a perfect example of how we can achieve better living through cryptography. We now begin a new chapter in Sia's history. May our stewardship lead it into a bright future.
 

Overview

Today, we are proposing the creation of the Sia Foundation: a new non-profit entity that builds and supports distributed cloud storage infrastructure, with a specific focus on the Sia storage platform. What follows is an informal overview of the Sia Foundation, covering two major topics: how the Foundation will be funded, and what its funds will be used for.

Organizational Structure

The Sia Foundation will be structured as a non-profit entity incorporated in the United States, likely a 501(c)(3) organization or similar. The actions of the Foundation will be constrained by its charter, which formalizes the specific obligations and overall mission outlined in this document. The charter will be updated on an annual basis to reflect the current goals of the Sia community.
The organization will be operated by a board of directors, initially comprising Luke Champine as President and Eddie Wang as Chairman. Luke Champine will be leaving his position at Nebulous to work at the Foundation full-time, and will seek to divest his shares of Nebulous stock along with other potential conflicts of interest. Neither Luke nor Eddie personally own any siafunds or significant quantities of siacoin.

Funding

The Foundation's primary funding will come from a new block subsidy. Following a hardfork, 30 KS per block will be allocated to the "Foundation Fund," continuing in perpetuity. The existing 30 KS per block miner reward is not affected. Additionally, one year's worth of block subsidies (approximately 1.57 GS) will be allocated to the Fund immediately upon activation of the hardfork.
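For context, the 1.57 GS figure follows directly from the block rate; a quick sanity check, assuming Sia's roughly 10-minute block target (an assumption not stated in this proposal):

```python
# Rough check of the one-year Foundation subsidy figure (assumes ~144 blocks/day).
BLOCKS_PER_DAY = 24 * 6            # 10-minute block target
SUBSIDY_PER_BLOCK = 30_000         # 30 KS per block to the Foundation Fund

one_year_subsidy = 365 * BLOCKS_PER_DAY * SUBSIDY_PER_BLOCK
print(f"{one_year_subsidy / 1e9:.2f} GS")   # ~1.58 GS, in line with the ~1.57 GS quoted
```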
As detailed below, the Foundation will provably burn any coins that it cannot meaningfully spend. As such, the 30 KS subsidy should be viewed as a maximum. This allows the Foundation to grow alongside Sia without requiring additional hardforks.
The Foundation will not be funded to any degree by the possession or sale of siafunds. Siafunds were originally introduced as a means of incentivizing growth, and we still believe in their effectiveness: a siafund holder wants to increase the amount of storage on Sia as much as possible. While the Foundation obviously wants Sia to succeed, its driving force should be its charter. Deriving significant revenue from siafunds would jeopardize the Foundation's impartiality and focus. Ultimately, we want the Foundation to act in the best interests of Sia, not in growing its own budget.

Responsibilities

The Foundation inherits a great number of responsibilities from Nebulous. Each quarter, the Foundation will publish the progress it has made over the past quarter, and list the responsibilities it intends to prioritize over the coming quarter. This will be accompanied by a financial report, detailing each area of expenditure over the past quarter, and forecasting expenditures for the coming quarter. Below, we summarize some of the myriad responsibilities towards which the Foundation is expected to allocate its resources.

Maintain and enhance core Sia software

Arguably, this is the most important responsibility of the Foundation. At the heart of Sia is its consensus algorithm: regardless of other differences, all Sia software must agree upon the content and rules of the blockchain. It is therefore crucial that the algorithm be stewarded by an entity that is accountable to the community, transparent in its decision-making, and has no profit motive or other conflicts of interest.
Accordingly, Sia’s consensus functionality will no longer be directly maintained by Nebulous. Instead, the Foundation will release and maintain an implementation of a "minimal Sia full node," comprising the Sia consensus algorithm and P2P networking code. The source code will be available in a public repository, and signed binaries will be published for each release.
Other parties may use this code to provide alternative full node software. For example, Nebulous may extend the minimal full node with wallet, renter, and host functionality. The source code of any such implementation may be submitted to the Foundation for review. If the code passes review, the Foundation will provide "endorsement signatures" for the commit hash used and for binaries compiled internally by the Foundation. Specifically, these signatures assert that the Foundation believes the software contains no consensus-breaking changes or other modifications to imported Foundation code. Endorsement signatures and Foundation-compiled binaries may be displayed and distributed by the receiving party, along with an appropriate disclaimer.
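As a rough illustration of how a third party might consume an endorsement signature (the signature scheme, key format, and helper names here are assumptions for the sketch; the Foundation's actual tooling is not specified in this proposal):

```python
import hashlib
from nacl.signing import VerifyKey   # PyNaCl; the actual scheme used is an assumption here

def verify_endorsed_binary(binary_path: str, signature: bytes, foundation_pubkey_hex: str) -> bool:
    """Check that the Foundation's endorsement signature covers this exact binary."""
    digest = hashlib.sha256(open(binary_path, "rb").read()).digest()
    VerifyKey(bytes.fromhex(foundation_pubkey_hex)).verify(digest, signature)  # raises if invalid
    return True
```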
A minimal full node is not terribly useful on its own; the wallet, renter, host, and other extensions are what make Sia a proper developer platform. Currently, the only implementations of these extensions are maintained by Nebulous. The Foundation will contract Nebulous to ensure that these extensions continue to receive updates and enhancements. Later on, the Foundation intends to develop its own implementations of these extensions and others. As with the minimal node software, these extensions will be open source and available in public repositories for use by any Sia node software.
With the consensus code now managed by the Foundation, the task of implementing and orchestrating hardforks becomes its responsibility as well. When the Foundation determines that a hardfork is necessary (whether through internal discussion or via community petition), a formal proposal will be drafted and submitted for public review, during which arguments for and against the proposal may be submitted to a public repository. During this time, the hardfork code will be implemented, either by Foundation employees or by external contributors working closely with the Foundation. Once the implementation is finished, final arguments will be heard. The Foundation board will then vote whether to accept or reject the proposal, and announce their decision along with appropriate justification. Assuming the proposal was accepted, the Foundation will announce the block height at which the hardfork will activate, and will subsequently release source code and signed binaries that incorporate the hardfork code.
Regardless of the Foundation's decision, it is the community that ultimately determines whether a fork is accepted or rejected – nothing can change that. Foundation node software will never automatically update, so all forks must be explicitly adopted by users. Furthermore, the Foundation will provide replay and wipeout protection for its hard forks, protecting other chains from unintended or malicious reorgs. Similarly, the Foundation will ensure that any file contracts formed prior to a fork activation will continue to be honored on both chains until they expire.
Finally, the Foundation also intends to pursue scalability solutions for the Sia blockchain. In particular, work has already begun on an implementation of Utreexo, which will greatly reduce the space requirements of fully-validating nodes (allowing a full node to be run on a smartphone) while increasing throughput and decreasing initial sync time. A hardfork implementing Utreexo will be submitted to the community as per the process detailed above.
As this is the most important responsibility of the Foundation, it will receive a significant portion of the Foundation’s budget, primarily in the form of developer salaries and contracting agreements.

Support community services

We intend to allocate 25% of the Foundation Fund towards the community. This allocation will be held and disbursed in the form of siacoins, and will pay for grants, bounties, hackathons, and other community-driven endeavours.
Any community-run service, such as a Skynet portal, explorer or web wallet, may apply to have its costs covered by the Foundation. Upon approval, the Foundation will reimburse expenses incurred by the service, subject to the exact terms agreed to. The intent of these grants is not to provide a source of income, but rather to make such services "break even" for their operators, so that members of the community can enrich the Sia ecosystem without worrying about the impact on their own finances.

Ensure easy acquisition and storage of siacoins

Most users will acquire their siacoins via an exchange. The Foundation will provide support to Sia-compatible exchanges, and pursue relevant integrations at its discretion, such as Coinbase's new Rosetta standard. The Foundation may also release DEX software that enables trading cryptocurrencies without the need for a third party. (The Foundation itself will never operate as a money transmitter.)
Increasingly, users are storing their cryptocurrency on hardware wallets. The Foundation will maintain the existing Ledger Nano S integration, and pursue further integrations at its discretion.
Of course, all hardware wallets must be paired with software running on a computer or smartphone, so the Foundation will also develop and/or maintain client-side wallet software, including both full-node wallets and "lite" wallets. Community-operated wallet services, i.e. web wallets, may be funded via grants.
Like core software maintenance, this responsibility will be funded in the form of developer salaries and contracting agreements.

Protect the ecosystem

When it comes to cryptocurrency security, patching software vulnerabilities is table stakes; there are significant legal and social threats that we must be mindful of as well. As such, the Foundation will earmark a portion of its fund to defend the community from legal action. The Foundation will also safeguard the network from 51% attacks and other threats to network security by implementing softforks and/or hardforks where necessary.
The Foundation also intends to assist in the development of a new FOSS software license, and to solicit legal memos on various Sia-related matters, such as hosting in the United States and the EU.
In a broader sense, the establishment of the Foundation makes the ecosystem more robust by transferring core development to a more neutral entity. Thanks to its funding structure, the Foundation will be immune to various forms of pressure that for-profit companies are susceptible to.

Drive adoption of Sia

Although the overriding goal of the Foundation is to make Sia the best platform it can be, all that work will be in vain if no one uses the platform. There are a number of ways the Foundation can promote Sia and get it into the hands of potential users and developers.
In-person conferences are understandably far less popular now, but the Foundation can sponsor and/or participate in virtual conferences. (In-person conferences may be held in the future, circumstances permitting.) Similarly, the Foundation will provide prizes for hackathons, which may be organized by community members, Nebulous, or the Foundation itself. Lastly, partnerships with other companies in the cryptocurrency space—or the cloud storage space—are a great way to increase awareness of Sia. To handle these responsibilities, one of the early priorities of the Foundation will be to hire a marketing director.

Fund Management

The Foundation Fund will be controlled by a multisig address. Each member of the Foundation's board will control one of the signing keys, with the signature threshold to be determined once the final composition of the board is known. (This threshold may also be increased or decreased if the number of board members changes.) Additionally, one timelocked signing key will be controlled by David Vorick. This key will act as a “dead man’s switch,” to be used in the event of an emergency that prevents Foundation board members from reaching the signature threshold. The timelock ensures that this key cannot be used unless the Foundation fails to sign a transaction for several months.
On the 1st of each month, the Foundation will use its keys to transfer all siacoins in the Fund to two new addresses. The first address will be controlled by a high-security hot wallet, and will receive approximately one month's worth of Foundation expenditures. The second address, receiving the remaining siacoins, will be a modified version of the source address: specifically, it will increase the timelock on David Vorick's signing key by one month. Any other changes to the set of signing keys, such as the arrival or departure of board members, will be incorporated into this address as well.
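A minimal sketch of that monthly rollover, assuming a simple majority threshold and an abstract address representation (both placeholders; the proposal does not specify Sia's actual transaction format here):

```python
from datetime import timedelta

def monthly_rollover(fund_balance, monthly_budget, board_keys, deadman_unlock_date):
    """Illustrative only: split the Fund into a hot-wallet tranche covering ~one month of
    expenditures and a fresh multisig address whose dead-man's-switch key is timelocked
    one additional month into the future."""
    hot_wallet_amount = min(monthly_budget, fund_balance)
    remainder = fund_balance - hot_wallet_amount

    new_fund_address = {
        "signers": list(board_keys),                    # updated if board membership changes
        "threshold": len(board_keys) // 2 + 1,          # placeholder; the real threshold is TBD
        "deadman_key_unlocks_at": deadman_unlock_date + timedelta(days=30),
    }
    return hot_wallet_amount, remainder, new_fund_address
```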
The Foundation Fund is allocated in SC, but many of the Foundation's expenditures must be paid in USD or other fiat currency. Accordingly, the Foundation will convert, at its discretion, a portion of its monthly withdrawals to fiat currency. We expect this conversion to be primarily facilitated by private "OTC" sales to accredited investors. The Foundation currently has no plans to speculate in cryptocurrency or other assets.
Finally, it is important that the Foundation adds value to the Sia platform well in excess of the inflation introduced by the block subsidy. For this reason, the Foundation intends to provably burn, on a quarterly basis, any coins that it cannot allocate towards any justifiable expense. In other words, coins will be burned whenever doing so provides greater value to the platform than any other use. Furthermore, the Foundation will cap its SC treasury at 5% of the total supply, and will cap its USD treasury at 4 years’ worth of predicted expenses.
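A sketch of the stated burn rule, assuming the quarterly burn simply covers any unallocatable coins plus whatever exceeds the 5% treasury cap (the USD cap is left out of this sketch):

```python
def quarterly_burn(unallocated_sc, sc_treasury, total_supply):
    """Illustrative, not official policy code: burn whatever cannot be justifiably spent,
    and never let the SC treasury exceed 5% of the total supply."""
    cap_excess = max(0.0, sc_treasury - 0.05 * total_supply)
    return unallocated_sc + cap_excess
```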
 
Addendum: Hardfork Timeline
We would like to see this proposal finalized and accepted by the community no later than September 30th. A new version of siad, implementing the hardfork, will be released no later than October 15th. The hardfork will activate at block 293220, which is expected to occur around 12pm EST on January 1st, 2021.
 
Addendum: Inflation specifics
The total supply of siacoins as of January 1st, 2021 will be approximately 45.243 GS. The initial subsidy of 1.57 GS thus increases the supply by 3.47%, and the total annual inflation in 2021 will be at most 10.4% (if zero coins are burned). In 2022, total annual inflation will be at most 6.28%, and will steadily decrease in subsequent years.
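These percentages can be reproduced from the figures above, assuming ~144 blocks per day (a 10-minute block target, not stated here) and the 30 KS miner reward plus the new 30 KS Foundation subsidy:

```python
blocks_per_year = 365 * 144                                    # ~10-minute blocks (assumed)
annual_emission = blocks_per_year * (30_000 + 30_000) / 1e9    # miner + Foundation subsidy, in GS

supply_2021 = 45.243     # GS at January 1st, 2021
initial_subsidy = 1.57   # one-off allocation at hardfork activation

print(f"{initial_subsidy / supply_2021:.2%}")                                        # ~3.47%
print(f"{(initial_subsidy + annual_emission) / supply_2021:.2%}")                    # ~10.4% in 2021
print(f"{annual_emission / (supply_2021 + initial_subsidy + annual_emission):.2%}")  # ~6.3% in 2022
```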
 

Conclusion

We see the establishment of the Foundation as an important step in the maturation of the Sia project. It provides the ecosystem with a sustainable source of funding that can be exclusively directed towards achieving Sia's ambitious goals. Compared to other projects with far deeper pockets, Sia has always punched above its weight; once we're on equal footing, there's no telling what we'll be able to achieve.
Nevertheless, we do not propose this change lightly, and have taken pains to ensure that the Foundation will act in accordance with the ideals that this community shares. It will operate transparently, keep inflation to a minimum, and respect the user's fundamental role in decentralized systems. We hope that everyone in the community will consider this proposal carefully, and look forward to a productive discussion.
submitted by lukechampine to siacoin [link] [comments]

Syscoin Platform’s Great Reddit Scaling Bake-off Proposal

We are excited to participate and present Syscoin Platform's ideal characteristics and capabilities towards a well-rounded Reddit Community Points solution!
Our scaling solution for Reddit Community Points involves 2-way peg interoperability with Ethereum. This will provide a scalable token layer built specifically for speed and high volumes of simple value transfers at very low cost, while preserving sovereign ownership and on-chain finality.
Token transfers scale by taking advantage of a globally sorting mempool that provides probabilistically secure assurances that transfers are "as good as settled". Token receivers can therefore tune the speed/security tradeoff at the application layer (99.9999% assurance within 10 seconds). We call this Z-DAG, and it achieves high throughput across a mesh network topology presently composed of about 2,000 geographically dispersed full nodes. These nodes are incentivized to run full nodes for the benefit of network security through a bonded validator scheme, yet, as in Bitcoin, they do not participate in transaction consensus or block validation any differently than other nodes, and therefore do not degrade Bitcoin's validate-first-then-trust security model at every node. Each token transfer settles on-chain. The protocol follows Bitcoin Core policies, so it has adequate code coverage and protocol hardening to qualify as production-quality software. It shares a significant portion of Bitcoin's own hashpower through merged mining.
This platform as a whole can serve token microtransactions, larger settlements, and store-of-value in an ideal fashion, providing probabilistic scalability whilst remaining decentralized according to Bitcoin design. It is accessible to ERC-20 via a permissionless and trust-minimized bridge that works in both directions. The bridge and token platform are currently available on the Syscoin mainnet. This has been gaining recent attention for use by loyalty point programs and stablecoins such as Binance USD.

Solutions

Syscoin Foundation identified a few paths for Reddit to leverage this infrastructure, each with trade-offs. The first provides the most cost savings and scaling benefits at some sacrifice of token autonomy. The second preserves more autonomy, with a narrower scope of cost savings than the first option, though still meaningful savings. The third introduces more complexity than the previous two yet provides the most overall benefit. We consider the third the most viable, as it enables Reddit to benefit even while retaining existing smart contract functionality. We will focus on the third option, and include the first two for good measure.
  1. Distribution, burns and user-to-user transfers of Reddit Points are entirely carried out on the Syscoin network. This full-on approach to utilizing the Syscoin network provides the most scalability and transaction cost benefits of these scenarios. The tradeoff here is distribution and subscription handling likely migrating away from smart contracts into the application layer.
  2. The Reddit Community Points ecosystem can continue to use existing smart contracts as they are used today on the Ethereum mainchain. Users migrate a portion of their tokens to Syscoin, the scaling network, to gain much lower fees, scalability, and a proven base layer, without sacrificing sovereign ownership. They would use Syscoin for user-to-user transfers. Tips redeemable in ten seconds or less, a high-throughput relay network, and onchain settlement at a block target of 60 seconds.
  3. Integration between Matic Network and Syscoin Platform - similar to Syscoin’s current integration with Ethereum - will provide Reddit Community Points with EVM scalability (including the Memberships ERC777 operator) on the Matic side, and performant simple value transfers, robust decentralized security, and sovereign store-of-value on the Syscoin side. It’s “the best of both worlds”. The trade-off is more complex interoperability.

Syscoin + Matic Integration

Matic and Blockchain Foundry Inc, the public company formed by the founders of Syscoin, recently entered a partnership for joint research and business development initiatives. This is ideal for all parties as Matic Network and Syscoin Platform provide complementary utility. Syscoin offers characteristics for sovereign ownership and security based on Bitcoin’s time-tested model, and shares a significant portion of Bitcoin’s own hashpower. Syscoin’s focus is on secure and scalable simple value transfers, trust-minimized interoperability, and opt-in regulatory compliance for tokenized assets rather than scalability for smart contract execution. On the other hand, Matic Network can provide scalable EVM for smart contract execution. Reddit Community Points can benefit from both.
Syscoin + Matic integration is actively being explored by both teams, as it is helpful to Reddit, Ethereum, and the industry as a whole.

Proving Performance & Cost Savings

Our POC focuses on 100,000 on-chain settlements of token transfers on the Syscoin Core blockchain. Transfers and burns perform equally with Syscoin. For POCs related to smart contracts (subscriptions, etc), refer to the Matic Network proposal.
On-chain settlement of 100k transactions was accomplished within roughly twelve minutes, well exceeding Reddit's expectation of five days. This was performed using six full nodes operating on compute-optimized AWS c4.2xlarge instances which were geographically distributed (Virginia, London, Sao Paulo Brazil, Oregon, Singapore, Germany). A higher quantity of settlements could be reached within the same timeframe with more broadcasting nodes involved, or by using hosts with more resources for faster execution of the process.
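For reference, that settlement rate works out to roughly 139 transactions per second sustained on-chain:

```python
transactions = 100_000
elapsed_seconds = 12 * 60        # "roughly twelve minutes"
print(f"{transactions / elapsed_seconds:.0f} tx/s settled on-chain")   # ~139 tx/s
```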
Addresses used: 100,014
The demonstration was executed using this tool. The results can be seen in the following blocks:
612722: https://sys1.bcfn.ca/block/6d47796d043bb4c508d29123e6ae81b051f5e0aaef849f253c8f3a6942a022ce
612723: https://sys1.bcfn.ca/block/8e2077f743461b90f80b4bef502f564933a8e04de97972901f3d65cfadcf1faf
612724: https://sys1.bcfn.ca/block/205436d25b1b499fce44c29567c5c807beaca915b83cc9f3c35b0d76dbb11f6e
612725: https://sys1.bcfn.ca/block/776d1b1a0f90f655a6bbdf559ff5072459cbdc5682d7615ff4b78c00babdc237
612726: https://sys1.bcfn.ca/block/de4df0994253742a1ac8ac9eec8d2a8c8b0a6d72c53d6f3caa29bb6c171b0a6b
612727: https://sys1.bcfn.ca/block/e5e167c52a9decb313fbaadf49a5e34cb490f8084f642a850385476d4ef10d70
612728: https://sys1.bcfn.ca/block/ab64d989edc71890e7b5b8491c20e9a27520dc45a5f7c776d3dae79057f59fe7
612729: https://sys1.bcfn.ca/block/5e8b7ecd0e36f99d07e4ea6e135fc952bf7ec30164ab6f4d1e98b0f2d405df6d
612730: https://sys1.bcfn.ca/block/d395df3d31dde60bbb0bece6bd5b358297da878f0beb96be389e5f0e043580a3
It is important to note that this POC is not focused on Z-DAG. The performance of Z-DAG has been benchmarked within realistic network conditions: Whiteblock’s audit is publicly available. Network latency tests showed an average TPS around 15k with burst capacity up to 61k. Zero-latency control group exhibited ~150k TPS. Mainnet testing of the Z-DAG network is achievable and will require further coordination and additional resources.
Even further optimizations are expected in the upcoming Syscoin Core release which will implement a UTXO model for our token layer bringing further efficiency as well as open the door to additional scaling technology currently under research by our team and academic partners. At present our token layer is account-based, similar to Ethereum. Opt-in compliance structures will also be introduced soon which will offer some positive performance characteristics as well. It makes the most sense to implement these optimizations before performing another benchmark for Z-DAG, especially on the mainnet considering the resources required to stress-test this network.

Cost Savings

Total cost for these 100k transactions: $0.63 USD
See the live fee comparison for savings estimation between transactions on Ethereum and Syscoin. Below is a snapshot at time of writing:
ETH price: $318.55
ETH gas price: 55.00 Gwei ($0.37)
Syscoin price: $0.11
Snapshot of live fee comparison chart
Z-DAG provides a more efficient fee-market. A typical Z-DAG transaction costs 0.0000582 SYS. Tokens can be safely redeemed/re-spent within seconds or allowed to settle on-chain beforehand. The costs should remain about this low for microtransactions.
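Those figures are internally consistent; a quick check using the fee and price quoted above:

```python
fee_per_tx_sys = 0.0000582     # typical Z-DAG transaction fee, in SYS
sys_price_usd = 0.11           # snapshot price at time of writing

print(f"${100_000 * fee_per_tx_sys * sys_price_usd:.2f} for 100k transfers")  # ~$0.64 vs. $0.63 reported
print(f"${fee_per_tx_sys * sys_price_usd:.7f} per transfer")                  # ~$0.0000064
```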
Syscoin will achieve further reduction of fees and even greater scalability with off-chain payment channels for assets, with Z-DAG as a resilience fallback. New payment channel technology is one of the topics under research by the Syscoin development team with our academic partners at TU Delft. In line with the calculation in the Lightning Network white paper, payment channels using assets with Syscoin Core will bring theoretical capacity for each person on Earth (7.8 billion) to have five on-chain transactions per year, per person, without requiring anyone to enter a fee market (aka "wait for a block"). This exceeds the minimum LN expectation of two transactions per person, per year; one to exist on-chain and one to settle aggregated value.
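That per-person capacity estimate can be reproduced from the block bandwidth listed in the Syscoin specs later in this proposal (16 MB per minute, ~200-byte average transactions):

```python
bandwidth_per_minute = 16 * 1_000_000    # bytes of block space per minute
avg_tx_size = 200                        # bytes, assumed average
world_population = 7.8e9

tx_per_year = (bandwidth_per_minute / avg_tx_size) * 60 * 24 * 365
print(f"{tx_per_year / world_population:.1f} on-chain tx per person per year")   # ~5.4
```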

Tools, Infrastructure & Documentation

Syscoin Bridge

Mainnet Demonstration of Syscoin Bridge with the Basic Attention Token ERC-20
A two-way blockchain interoperability system that uses Simple Payment Verification to enable:
  • Any Standard ERC-20 token to be moved from Ethereum to the Syscoin blockchain as a Syscoin Platform Token (SPT), and back to Ethereum
  • Any SPT to be moved from Syscoin to the Ethereum blockchain as an ERC-20 token, and back to Syscoin

Benefits

  • Permissionless
  • No counterparties involved
  • No trading mechanisms involved
  • No third-party liquidity providers required
  • Cross-chain Fractional Supply - 2-way peg - Token supply maintained globally
  • ERC-20s gain vastly improved transactionality with the Syscoin Token Platform, along with the security of bitcoin-core-compliant PoW.
  • SPTs gain access to all the tooling, applications and capabilities of Ethereum for ERC-20, including smart contracts.

Source code

https://github.com/syscoin/?q=sysethereum
Main Subprojects

API

Tools to simplify using Syscoin Bridge as a service with dapps and wallets will be released some time after implementation of Syscoin Core 4.2. These will be based upon the same processes which are automated in the current live Sysethereum Dapp that is functioning with the Syscoin mainnet.

Documentation

Syscoin Bridge & How it Works (description and process flow)
Superblock Validation Battles
HOWTO: Provision the Bridge for your ERC-20
HOWTO: Setup an Agent
Developer & User Diligence

Trade-off

The Syscoin Ethereum Bridge is secured by Agent nodes participating in a decentralized and incentivized model that involves roles of Superblock challengers and submitters. This model is open to participation. The benefits here are trust-minimization, permissionlessness, and potentially less legal/regulatory red tape than interop mechanisms that involve liquidity providers and/or trading mechanisms.
The trade-off is that, due to the decentralized nature, there are cross-chain settlement times of one hour to cross from Ethereum to Syscoin, and three hours to cross from Syscoin to Ethereum. We are exploring ways to reduce this time while maintaining decentralization via zero-knowledge proofs. Even so, an "instant bridge" experience could be provided by means of a third-party liquidity mechanism. That option exists but is not required for bridge functionality today. Typically bridges are used with batch value, not with high frequencies of smaller values, and generally it is advantageous to keep some value on both chains for maximum availability of utility. Even so, the cross-chain settlement time is worth mentioning here.

Cost

Ethereum -> Syscoin: a Matic or Ethereum transaction fee for bridge contract interaction, plus a negligible Syscoin transaction fee for minting tokens.
Syscoin -> Ethereum: a negligible Syscoin transaction fee for burning tokens, a 0.01% transaction fee paid to the Bridge Agent in the form of the ERC-20, and a Matic or Ethereum transaction fee for contract interaction.

Z-DAG

Zero-Confirmation Directed Acyclic Graph is an instant settlement protocol that is used as a complementary system to proof-of-work (PoW) in the confirmation of Syscoin service transactions. In essence, a Z-DAG is simply a directed acyclic graph (DAG) where validating nodes verify the sequential ordering of transactions that are received in their memory pools. Z-DAG is used by the validating nodes across the network to ensure that there is absolute consensus on the ordering of transactions and no balances are overflowed (no double-spends).
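The white paper linked below has the full construction; the following is only a toy sketch of the core bookkeeping idea (tracking pending token balances in the mempool and rejecting transfers that would overdraw them), with all names and structures hypothetical:

```python
from collections import defaultdict

class ZdagMempoolSketch:
    """Toy double-spend screening: track each account's pending (unconfirmed) token balance
    and refuse any transfer that would overdraw it. A simplification, not the real protocol."""

    def __init__(self, confirmed_balances):
        self.pending = defaultdict(int, confirmed_balances)

    def accept(self, sender, receiver, amount):
        if amount <= 0 or self.pending[sender] < amount:
            return False                      # would overflow the sender's balance: reject
        self.pending[sender] -= amount
        self.pending[receiver] += amount
        return True

mempool = ZdagMempoolSketch({"alice": 10})
print(mempool.accept("alice", "bob", 7))      # True  - fits within Alice's balance
print(mempool.accept("alice", "carol", 7))    # False - the conflicting spend is rejected
```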

Benefits

  • Unique fee-market that is more efficient for microtransaction redemption and settlement
  • Uses decentralized means to enable tokens with value transfer scalability that is comparable or exceeds that of credit card networks
  • Provides high throughput and secure fulfillment even if blocks are full
  • Probabilistic and interactive
  • 99.9999% security assurance within 10 seconds
  • Can serve payment channels as a resilience fallback that is faster and lower-cost than falling-back directly to a blockchain
  • Each Z-DAG transaction also settles onchain through Syscoin Core at 60-second block target using SHA-256 Proof of Work consensus

Source code

https://github.com/syscoin/syscoin

API

Syscoin-js provides tooling for all Syscoin Core RPCs including interactivity with Z-DAG.

Documentation

Z-DAG White Paper
Useful read: An in-depth Z-DAG discussion between Syscoin Core developer Jag Sidhu and Brave Software Research Engineer Gonçalo Pestana

Trade-off

Z-DAG enables the ideal speed/security tradeoff to be determined per use-case in the application layer. It minimizes the sacrifice required to accept and redeem fast transfers/payments while providing more-than-ample security for microtransactions. This rests on the premise that a Reddit user receiving points does need security, yet generally doesn't want or need to wait for the same level of security as a nation-state settling an international trade debt. In any case, each Z-DAG transaction settles on-chain at a block target of 60 seconds.

Syscoin Specs

Syscoin 3.0 White Paper
(4.0 white paper is pending. For improved scalability and less blockchain bloat, some features of v3 no longer exist in current v4: Specifically Marketplace Offers, Aliases, Escrow, Certificates, Pruning, Encrypted Messaging)
  • 16MB block bandwidth per minute assuming segwit witness carrying transactions, and transactions ~200 bytes on average
  • SHA256 merge mined with Bitcoin
  • UTXO asset layer, with base Syscoin layer sharing identical security policies as Bitcoin Core
  • Z-DAG on asset layer, bridge to Ethereum on asset layer
  • On-chain scaling with prospect of enabling enterprise grade reliable trustless payment processing with on/offchain hybrid solution
  • Focus only on Simple Value Transfers. MVP of blockchain consensus footprint is balances and ownership of them. Everything else can reduce data availability in exchange for scale (Ethereum 2.0 model). We leave that to other designs, we focus on transfers.
  • Future integrations of MAST/Taproot to get more complex value transfers without trading off trustlessness or decentralization.
  • Zero-knowledge proofs are a new cryptographic frontier. We are dabbling here to generalize the concept of bridging and also to verify the state of a chain efficiently. We also apply it in our Digital Identity projects at Blockchain Foundry (a publicly traded company which develops Syscoin software for clients). We are also looking to integrate privacy-preserving payment channels for off-chain payments through a zkSNARK hub & spoke design which does not suffer from the HTLC attack vectors evident on LN. Many of the issues plaguing Lightning Network can be resolved using a zkSNARK design whilst also providing the ability to do a multi-asset payment channel system. We have currently found a showstopper attack (the American Call Option) on LN if we were to use multiple assets; this would not exist in a system such as this.

Wallets

Web3 and mobile wallets are under active development by Blockchain Foundry Inc as WebAssembly applications and expected for release not long after mainnet deployment of Syscoin Core 4.2. Both of these will be multi-coin wallets that support Syscoin, SPTs, Ethereum, and ERC-20 tokens. The Web3 wallet will provide functionality similar to Metamask.
Syscoin Platform and tokens are already integrated with Blockbook. Custom hardware wallet support currently exists via ElectrumSys. First-class HW wallet integration through apps such as Ledger Live will exist after 4.2.
Current supported wallets
Syscoin Spark Desktop
Syscoin-Qt

Explorers

Mainnet: https://sys1.bcfn.ca (Blockbook)
Testnet: https://explorer-testnet.blockchainfoundry.co

Thank you for close consideration of our proposal. We look forward to feedback, and to working with the Reddit community to implement an ideal solution using Syscoin Platform!

submitted by sidhujag to ethereum [link] [comments]

Bitcoin Newcomers FAQ - Please read!

Welcome to the /Bitcoin Sticky FAQ

You've probably been hearing a lot about Bitcoin recently and are wondering what's the big deal? Most of your questions should be answered by the resources below but if you have additional questions feel free to ask them in the comments.
It all started with the release of Satoshi Nakamoto's whitepaper; however, that will probably go over the head of most readers, so we recommend the following videos as a good starting point for understanding how Bitcoin works and a little about its long-term potential:
Some other great resources include Lopp.net, the Princeton crypto series and James D'Angelo's Bitcoin 101 Blackboard series.
Some excellent writing on Bitcoin's value proposition and future can be found at the Satoshi Nakamoto Institute.
Some Bitcoin statistics can be found here and here. Developer resources can be found here. Peer-reviewed research papers can be found here.
Potential upcoming protocol improvements and scaling resources here and here.
The number of times Bitcoin was declared dead by the media can be found here (LOL!)

Key properties of Bitcoin

Where can I buy bitcoins?

Bitcoin.org and BuyBitcoinWorldwide.com are helpful sites for beginners. You can buy or sell any amount of bitcoin (even just a few dollars worth) and there are several easy methods to purchase bitcoin with cash, credit card or bank transfer. Some of the more popular resources are below, also check out the bitcoinity exchange resources for a larger list of options for purchases.
Here is a listing of local ATMs. If you would like your paycheck automatically converted to bitcoin use Bitwage.
Note: Bitcoins are valued at whatever market price people are willing to pay for them in a balancing act of supply vs. demand. Unlike traditional markets, bitcoin markets operate 24 hours per day, 365 days per year. Preev is a useful site that shows how much various denominations of bitcoin are worth in different currencies. Alternatively you can just Google "1 bitcoin in (your local currency)".

Securing your bitcoins

With bitcoin you can "Be your own bank" and personally secure your bitcoins OR you can use third party companies aka "Bitcoin banks" which will hold the bitcoins for you.
Note: For increased security, use Two Factor Authentication (2FA) everywhere it is offered, including email!
2FA requires a second confirmation code to access your account making it much harder for thieves to gain access. Google Authenticator and Authy are the two most popular 2FA services, download links are below. Make sure you create backups of your 2FA codes.
| Google Auth | Authy | OTP Auth |
|---|---|---|
| Android | Android | N/A |
| iOS | iOS | iOS |

Watch out for scams

As mentioned above, Bitcoin is decentralized, which by definition means there is no official website or Twitter handle or spokesperson or CEO. However, all money attracts thieves. This combination unfortunately results in scammers running sites with official-sounding names or pretending to be an authority on YouTube or social media. Many scammers throughout the years have claimed to be the inventor of Bitcoin. Websites like bitcoin(dot)com and the btc subreddit are active scams. Almost all altcoins (shitcoins) are marketed heavily with big promises but are really just designed to separate you from your bitcoin. So be careful: any resource, including all linked in this document, may in the future turn evil. Don't trust, verify. Also, as they say in our community, "Not your keys, not your coins".

Where can I spend bitcoins?

Check out Spendabit or the Bitcoin Directory for millions of merchant options. Also, you can spend bitcoin anywhere Visa is accepted with bitcoin debit cards such as the CashApp card. Some other useful sites are listed below.
| Store | Product |
|---|---|
| Gyft | Gift cards for hundreds of retailers including Amazon, Target, Walmart, Starbucks, Whole Foods, CVS, Lowes, Home Depot, iTunes, Best Buy, Sears, Kohls, eBay, GameStop, etc. |
| Spendabit, Overstock and The Bitcoin Directory | Retail shopping with millions of results |
| ShakePay | Generate one time use Visa cards in seconds |
| NewEgg and Dell | For all your electronics needs |
| Bitwa.la, Coinbills, Piixpay, Bitbill.eu, Bylls, Coins.ph, Bitrefill, LivingRoomofSatoshi, Coinsfer, and more | Bill payment |
| Menufy, Takeaway and Thuisbezorgd NL | Takeout delivered to your door |
| Expedia, Cheapair, Destinia, Abitsky, SkyTours, the Travel category on Gyft and 9flats | For when you need to get away |
| Cryptostorm, Mullvad, and PIA | VPN services |
| Namecheap, Porkbun | Domain name registration |
| Stampnik | Discounted USPS Priority, Express, First-Class mail postage |
Coinmap and AirBitz are helpful to find local businesses accepting bitcoins. A good resource for UK residents is at wheretospendbitcoins.co.uk.
There are also lots of charities which accept bitcoin donations.

Merchant Resources

There are several benefits to accepting bitcoin as a payment option if you are a merchant;
If you are interested in accepting bitcoin as a payment method, there are several options available;

Can I mine bitcoin?

Mining bitcoins can be a fun learning experience, but be aware that you will most likely operate at a loss. Newcomers are often advised to stay away from mining unless they are only interested in it as a hobby similar to folding at home. If you want to learn more about mining you can read more here. Still have mining questions? The crew at /BitcoinMining would be happy to help you out.
If you want to contribute to the bitcoin network by hosting the blockchain and propagating transactions you can run a full node using this setup guide. If you would prefer to keep it simple there are several good options. You can view the global node distribution here.

Earning bitcoins

Just like any other form of money, you can also earn bitcoins by being paid to do a job.
| Site | Description |
|---|---|
| WorkingForBitcoins, Bitwage, Cryptogrind, Coinality, Bitgigs, /Jobs4Bitcoins, BitforTip, Rein Project | Freelancing |
| Lolli | Earn bitcoin when you shop online! |
| OpenBazaar, Purse.io, Bitify, /Bitmarket, 21 Market | Marketplaces |
| /GirlsGoneBitcoin | NSFW Adult services |
| A-ads, Coinzilla.io | Advertising |
You can also earn bitcoins by participating as a market maker on JoinMarket by allowing users to perform CoinJoin transactions with your bitcoins for a small fee (requires you to already have some bitcoins).

Bitcoin-Related Projects

The following is a short list of ongoing projects that might be worth taking a look at if you are interested in current development in the bitcoin space.
| Project | Description |
|---|---|
| Lightning Network | Second layer scaling |
| Blockstream, Rootstock and Drivechain | Sidechains |
| Hivemind and Augur | Prediction markets |
| Tierion and Factom | Records & Titles on the blockchain |
| BitMarkets, DropZone, Beaver and Open Bazaar | Decentralized markets |
| JoinMarket and Wasabi Wallet | CoinJoin implementation |
| Coinffeine and Bisq | Decentralized bitcoin exchanges |
| Keybase | Identity & Reputation management |
| Abra | Global P2P money transmitter network |
| Bitcore | Open source Bitcoin javascript library |

Bitcoin Units

One Bitcoin is quite large (hundreds of £/$/€) so people often deal in smaller units. The most common subunits are listed below:
| Unit | Symbol | Value | Info |
|---|---|---|---|
| bitcoin | BTC | 1 bitcoin | one bitcoin is equal to 100 million satoshis |
| millibitcoin | mBTC | 1,000 per bitcoin | used as default unit in recent Electrum wallet releases |
| bit | bit | 1,000,000 per bitcoin | colloquial "slang" term for microbitcoin (μBTC) |
| satoshi | sat | 100,000,000 per bitcoin | smallest unit in bitcoin, named after the inventor |
For example, assuming an arbitrary exchange rate of $10,000 for one Bitcoin, a $10 meal would equal:
  • 0.001 BTC
  • 1 mBTC
  • 1,000 bits
  • 100,000 satoshis
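A small helper showing the same conversion (the $10,000 rate is just the arbitrary example rate used above):

```python
SATS_PER_BTC = 100_000_000

def usd_to_bitcoin_units(usd_amount, usd_per_btc=10_000):
    btc = usd_amount / usd_per_btc
    return {"BTC": btc, "mBTC": btc * 1_000, "bits": btc * 1_000_000, "sats": round(btc * SATS_PER_BTC)}

print(usd_to_bitcoin_units(10))   # {'BTC': 0.001, 'mBTC': 1.0, 'bits': 1000.0, 'sats': 100000}
```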
For more information check out the Bitcoin units wiki.
Still have questions? Feel free to ask in the comments below or stick around for our weekly Mentor Monday thread. If you decide to post a question in /Bitcoin, please use the search bar to see if it has been answered before, and remember to follow the community rules outlined on the sidebar to receive a better response. The mods are busy helping manage our community so please do not message them unless you notice problems with the functionality of the subreddit.
Note: This is a community created FAQ. If you notice anything missing from the FAQ or that requires clarification you can edit it here and it will be included in the next revision pending approval.
Welcome to the Bitcoin community and the new decentralized economy!
submitted by BitcoinFan7 to Bitcoin [link] [comments]

Gridcoin 5.0.0.0-Mandatory "Fern" Release

https://github.com/gridcoin-community/Gridcoin-Research/releases/tag/5.0.0.0
Finally! After over ten months of development and testing, "Fern" has arrived! This is a whopper. 240 pull requests merged. Essentially a complete rewrite that was started with the scraper (the "neural net" rewrite) in "Denise" has now been completed. Practically the ENTIRE Gridcoin specific codebase resting on top of the vanilla Bitcoin/Peercoin/Blackcoin vanilla PoS code has been rewritten. This removes the team requirement at last (see below), although there are many other important improvements besides that.
Fern was a monumental undertaking. We had to encode all of the old rules active for the v10 block protocol in new code and ensure that the new code was 100% compatible. This had to be done in such a way as to clear out all of the old spaghetti and ring-fence it with tightly controlled class implementations. We then wrote an entirely new, simplified ruleset for research rewards and reengineered contracts (which includes beacon management, polls, and voting) using properly classed code. The fundamentals of Gridcoin with this release are now on a very sound and maintainable footing, and the developers believe the codebase as updated here will serve as the fundamental basis for Gridcoin's future roadmap.
We have been testing this for MONTHS on testnet in various stages. The v10 (legacy) compatibility code has been running on testnet continuously as it was developed to ensure compatibility with existing nodes. During the last few months, we have done two private testnet forks and then the full public testnet testing for v11 code (the new protocol which is what Fern implements). The developers have also been running non-staking "sentinel" nodes on mainnet with this code to verify that the consensus rules are problem-free for the legacy compatibility code on the broader mainnet. We believe this amount of testing is going to result in a smooth rollout.
Given the amount of changes in Fern, I am presenting TWO changelogs below. One is high level, which summarizes the most significant changes in the protocol. The second changelog is the detailed one in the usual format, and gives you an inkling of the size of this release.

Highlights

Protocol

Note that the protocol changes will not become active until we cross the hard-fork transition height to v11, which has been set at 2053000. Given current average block spacing, this should happen around October 4, about one month from now.
Note that to get all of the beacons in the network on the new protocol, we are requiring ALL beacons to be validated. A two week (14 day) grace period is provided by the code, starting at the time of the transition height, for people currently holding a beacon to validate the beacon and prevent it from expiring. That means that EVERY CRUNCHER must advertise and validate their beacon AFTER the v11 transition (around Oct 4th) and BEFORE October 18th (or more precisely, 14 days from the actual date of the v11 transition). If you do not advertise and validate your beacon by this time, your beacon will expire and you will stop earning research rewards until you advertise and validate a new beacon. This process has been made much easier by a brand new beacon "wizard" that helps manage beacon advertisements and renewals. Once a beacon has been validated and is a v11 protocol beacon, the normal 180 day expiration rules apply. Note, however, that the 180 day expiration on research rewards has been removed with the Fern update. This means that while your beacon might expire after 180 days, your earned research rewards will be retained and can be claimed by advertising a beacon with the same CPID and going through the validation process again. In other words, you do not lose any earned research rewards if you do not stake a block within 180 days and keep your beacon up-to-date.
The transition height is also when the team requirement will be relaxed for the network.

GUI

Besides the beacon wizard, there are a number of improvements to the GUI, including new UI transaction types (and icons) for staking the superblock, sidestake sends, beacon advertisement, voting, poll creation, and transactions with a message. The main screen has been revamped with a better summary section, and better status icons. Several changes under the hood have improved GUI performance. And finally, the diagnostics have been revamped.

Blockchain

The wallet sync speed has been DRASTICALLY improved. A decent machine with a good network connection should be able to sync the entire mainnet blockchain in less than 4 hours. A fast machine with a really fast network connection and a good SSD can do it in about 2.5 hours. One of our goals was to reduce or eliminate the reliance on snapshots for mainnet, and I think we have accomplished that goal with the new sync speed. We have also streamlined the in-memory structures for the blockchain which shaves some memory use.
There are so many goodies here it is hard to summarize them all.
I would like to thank all of the contributors to this release, but especially thank @cyrossignol, whose incredible contributions formed the backbone of this release. I would also like to pay special thanks to @barton2526, @caraka, and @Quezacoatl1, who tirelessly helped during the testing and polishing phase on testnet with testing and repeated builds for all architectures.
The developers are proud to present this release to the community and we believe this represents the starting point for a true renaissance for Gridcoin!

Summary Changelog

Accrual

Changed

Most significantly, nodes calculate research rewards directly from the magnitudes in EACH superblock between stakes instead of using a two- or three- point average based on a CPID's current magnitude and the magnitude for the CPID when it last staked. For those long-timers in the community, this has been referred to as "Superblock Windows," and was first done in proof-of-concept form by @denravonska.
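A heavily simplified sketch of the superblock-window idea, where the payout rate and data structures are placeholders rather than Gridcoin's actual constants:

```python
def accrued_reward(superblocks, cpid, grc_per_magnitude_per_day):
    """Toy illustration: integrate a CPID's magnitude over every superblock interval between
    two stakes, instead of averaging only the endpoint magnitudes.
    `superblocks` is a list of (timestamp_in_days, {cpid: magnitude}) pairs, oldest first."""
    total = 0.0
    for (t0, mags), (t1, _) in zip(superblocks, superblocks[1:]):
        total += mags.get(cpid, 0.0) * grc_per_magnitude_per_day * (t1 - t0)
    return total

# Example: magnitude 100 across two one-day superblock intervals at a placeholder rate.
sbs = [(0, {"cpid_a": 100}), (1, {"cpid_a": 100}), (2, {"cpid_a": 90})]
print(accrued_reward(sbs, "cpid_a", grc_per_magnitude_per_day=0.25))   # 50.0
```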

Removed

Beacons

Added

Changed

Removed

Unaltered

As a reminder:

Superblocks

Added

Changed

Removed

Voting

Added

Changed

Removed

Detailed Changelog

[5.0.0.0] 2020-09-03, mandatory, "Fern"

Added

Changed

Removed

Fixed

submitted by jamescowens to gridcoin [link] [comments]

Polkadot — An Early In-Depth Analysis — Part One — Overview and Benefits

Having recently researched Polkadot, as with other projects, I wanted to document what I had learnt so that others may potentially find it useful. Hopefully providing a balanced view, it will consist of three articles, outlined below.
Part One — Polkadot Overview and Benefits (This article)
Part Two — In-Depth look at the Consensus
Part Three — Limitations and Issues
I will provide links throughout, providing reference to sections, as well as include a list of sources at the bottom of the article for further reading.

Overview

Frustrated with the slow development of Ethereum 2.0, Dr. Gavin Wood, co-founder of Ethereum and inventor of Solidity, left to begin work on Polkadot, a next generation scalable blockchain protocol that connects multiple specialised blockchains into one unified network. It achieves scalability through a sharding infrastructure with multiple blockchains running in parallel, called parachains, that connect to a central chain called the Relay Chain.
Whilst it shares some similarities with Ethereum 2.0, one key differentiator is that it uses heterogeneous sharding: each parachain can be customised through the Substrate development framework, enabling it to be optimised for a specific use case and run in parallel, rather than every shard being the same. This is important because, when it comes to blockchain architecture, one size does not fit all: every blockchain makes trade-offs to support different features and use cases.
All parachains connect to the relay chain, which validates the state transition of connected parachains, providing shared state across the entire ecosystem. If the Relay Chain must revert for any reason, then all of the parachains would also revert. This is to ensure that the validity of the entire system can persist, and no individual part is corruptible. The shared state makes it so that the trust assumptions when using parachains are only those of the Relay Chain validator set, and no other. Since the validator set on the Relay Chain is expected to be secure with a large amount of stake put up to back it, it is desirable for parachains to benefit from this security.
This enables seamless interoperability between all parachains and parathreads using the Cross-chain Message Passing (XCMP) protocol, allowing arbitrary data — not just tokens — to be transferred across blockchains. Interoperability is also possible to other ecosystems through bridges, which are specifically designed parachains or parathreads that are custom made to interact with another ecosystem such as Ethereum, Bitcoin and Cosmos for example, enabling interoperability. Because these other ecosystems don’t use the same shared state of Polkadot, finality is incredibly important, because whilst the relay chain can roll back all the parachains, it can’t roll back the Ethereum or Bitcoin blockchains for example. This is discussed further in part three.
The relay chain is responsible for the network's shared security, consensus and cross-chain interoperability. It is secured by Validators and Nominators staking the native DOT tokens. Ultimately, scalability for the ecosystem is determined by how scalable the relay chain can be. The number of parachains is determined by the number of validators on the relay chain. The hope is to reach 1,000 validators, which would enable around 100 parachains, with each parachain capable of around 1,000 transactions per second.
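Taking those projections at face value, the aggregate throughput target is straightforward to estimate:

```python
parachains = 100           # projected with ~1,000 relay-chain validators
tps_per_parachain = 1_000
print(f"~{parachains * tps_per_parachain:,} tx/s across the ecosystem")   # ~100,000 tx/s
```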
Nominators stake their DOT tokens with validators they trust, with the validators likely charging a small commission to cover running costs. If a validator is found to have committed misconduct, a percentage of their stake, and of the nominators' stake, will be slashed, depending upon the severity. For Level 4 security threats, such as collusion or including an invalid block, 100% of the stake will be slashed. What's really important to understand is that both the validator's own stake and the nominated stake will be slashed, so you could lose all the DOT you have staked against a validator if they act maliciously. Therefore, it's very important not to simply maximise rewards while being oblivious to the risk: not only can you lose all your DOT, but you make the entire system less secure (addressed in part three). There have already been several minor slashing incidents so far, so this is something to really consider.
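A toy illustration of how a slashing event hits nominators proportionally alongside the validator; the severity fractions other than the Level 4 case are placeholders, not Polkadot's actual parameters:

```python
SLASH_FRACTION = {1: 0.001, 2: 0.01, 3: 0.10, 4: 1.00}   # only Level 4 (100%) is stated above

def slash(validator_stake, nominator_stakes, severity):
    f = SLASH_FRACTION[severity]
    return validator_stake * f, {who: stake * f for who, stake in nominator_stakes.items()}

lost_validator, lost_nominators = slash(10_000, {"nominator_a": 2_000, "nominator_b": 500}, severity=4)
print(lost_validator, lost_nominators)   # nominated stake is slashed along with the validator's own
```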
https://preview.redd.it/aj9v0azke6m51.png?width=700&format=png&auto=webp&s=86134eaef08d1ef50466d1d80ec5ce151327d702

Auction for Parachain Slots

Due to the limited number of parachain slots available, there needs to be a method to decide who gets a parachain slot. This is achieved through a candle auction, where participants bid with DOT to secure a lease on a parachain slot for a 6–24 month period, with the highest bidders winning. The DOT isn't spent, but rather locked for the duration of the lease and unable to participate in staking and earn rewards. If the team is unsuccessful in securing a further slot, then the lease expires and the DOT is returned.
Of the 100 parachain slots that they hope to be able to accommodate, between 10 and 30 will be reserved for system parachains, with the remainder available either for auction or for parathreads. Whilst the DOT is returned, the limited number of slots means that significant amounts of DOT may need to be acquired to secure a slot. How the auction mechanics affect the price of DOT also remains to be seen, with potentially a rise from the start of the auction, followed by a fall before the lease ends and the DOT are returned. The plan is to have a small number of parachain auctions running continuously throughout the year to minimise any unwanted effects. How comfortable developers will be with locking significant amounts of funds in a highly volatile asset for an extended period of time also remains to be seen. They could also end up in a position where they can no longer afford to keep their lease and have to downgrade to a parathread (provided the application can still function with the reduced performance) or migrate to another platform. See this article for more details on the auction mechanism.
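For intuition, a candle auction can be sketched as follows; the block numbers, bids and random close below are purely illustrative, not the actual auction parameters:

# Toy candle auction: bids arrive during the auction window, but the effective
# closing block is chosen retroactively at random, so late sniping is risky.
import random

def candle_auction(bids, opening_block, closing_block):
    """bids: list of (block_number, bidder, amount). Returns (winner, effective_close)."""
    effective_close = random.randint(opening_block, closing_block)  # decided after the fact
    valid = [b for b in bids if b[0] <= effective_close]
    if not valid:
        return None, effective_close
    _, winner, amount = max(valid, key=lambda b: b[2])
    return (winner, amount), effective_close

bids = [(105, "team_a", 40_000), (180, "team_b", 55_000), (199, "team_c", 90_000)]
print(candle_auction(bids, opening_block=100, closing_block=200))
# A bid placed after the retroactive close is simply ignored, however high it is.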
https://preview.redd.it/wp8rvxlme6m51.png?width=387&format=png&auto=webp&s=496320d627405362142210e1a4c17ebe43e1f8a1

Parathreads

For applications that don't require the guaranteed performance of a parachain, or don't want to pay the large fees to secure a parachain slot, parathreads can be used instead. Parathreads have a fixed registration fee that would realistically be much lower than the cost of acquiring a parachain slot, and they compete with other parathreads in a per-block auction to have their transactions included in the next relay chain block. A portion of the parachain slots on the Relay Chain will be designated as part of the parathread pool.
In the event that a parachain loses its slot then it can transition to a parathread (assuming the application can still function with the reduced and varied performance of sharing the slot between many). This also enables small projects to start out with a parathread and then upgrade to a parachain slot when required.

Token

DOT is the native token of the Polkadot network and serves three key functions: (i) it is staked to provide security for the relay chain, (ii) it is bonded to connect a chain to Polkadot as a parachain and (iii) it is used for governance of the network. There is an initial total supply of 1 billion DOT, with yearly inflation estimated to be around 10% provided the optimal 50% staking rate is achieved, resulting in rewards of 20% to those that stake (a net 10% once inflation is taken into account). Those that don't stake lose 10% through dilution. Should the amount staked exceed the optimal 50%, then reward rates reduce, as does inflation, to make staking less attractive. Likewise, if it's below 50%, then rewards and the inflation rate will be higher to encourage staking. Staking isn't risk free though, as mentioned before.
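The arithmetic at the optimal staking rate can be sketched as follows, using only the figures quoted above and deliberately ignoring how the actual inflation curve adjusts away from the optimum:

# Back-of-the-envelope reward arithmetic at the optimal 50% staking rate,
# using the figures quoted above (1 billion DOT supply, ~10% yearly inflation).

total_supply = 1_000_000_000       # initial DOT supply
inflation = 0.10                   # ~10% per year at the optimal staking rate
staked_fraction = 0.50             # the optimal rate targeted by the protocol

new_dot = total_supply * inflation                 # 100M DOT minted over the year
staker_reward_rate = inflation / staked_fraction   # 0.20 -> 20% nominal yield
real_yield = staker_reward_rate - inflation        # ~10% net of dilution

print(f"Minted: {new_dot:,.0f} DOT, nominal yield: {staker_reward_rate:.0%}, "
      f"net of inflation: {real_yield:.0%}")
# Non-stakers earn nothing and are diluted by the full 10% inflation.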

Governance

Polkadot employs an on-chain governance model where, in order to make any changes to the network, DOT holders vote on a proposal to upgrade the network with the help of the Council. The Council comprises 23 seats, each represented by an on-chain account. Its goals are to represent passive stakeholders, submit sensible and important proposals, and cancel dangerous or malicious proposals. All DOT holders are free to register their candidacy for the Council, and free to vote for any number of candidates, with a voting power proportional to their stake.
Any stakeholder can submit a public proposal by depositing a fixed minimum amount of DOTs, which stays locked for a certain period. If someone agrees with the proposal, they may deposit the same amount of tokens to endorse it. Public proposals are stored in a priority queue, and at regular intervals the proposal with the most endorsements gets tabled for a referendum. The locked tokens are released once the proposal is tabled. Council proposals are submitted by the Council, and are stored in a separate priority queue where the priorities are set at the Council’s discretion.
Every thirty days, a new proposal will be tabled, and a referendum will come up for a vote. The proposal to be tabled is the top proposal from either the public-proposal queue or the Council-proposal queue, alternating between the two queues.
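A minimal sketch of that alternating selection, with invented proposal names and endorsement amounts; this is an illustration of the rule described above, not Polkadot's actual runtime logic:

def next_referendum(public_queue, council_queue, turn):
    """Pick the proposal tabled this launch period.

    public_queue: list of (proposal, endorsed_stake) tuples.
    council_queue: list of proposals in the Council's own priority order.
    The two queues alternate; an empty queue yields its turn to the other.
    """
    def pop_top_public():
        top = max(public_queue, key=lambda p: p[1])   # most-endorsed public proposal
        public_queue.remove(top)
        return top[0]

    public_turn = (turn % 2 == 0)
    if public_turn and public_queue:
        return pop_top_public()
    if council_queue:
        return council_queue.pop(0)                   # Council's top-priority item
    if public_queue:
        return pop_top_public()
    return None

public = [("raise validator count", 3_000_000), ("treasury tweak", 1_200_000)]
council = ["runtime upgrade"]
for turn in range(3):
    print(turn, next_referendum(public, council, turn))
# 0 raise validator count / 1 runtime upgrade / 2 treasury tweak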
The Technical Committee is composed according to a single vote for each team that has successfully and independently implemented or formally specified the protocol in Polkadot, or in its canary network Kusama. The Technical Committee is the last line of defence for the system. Its sole purpose is detecting present or imminent issues in the system such as bugs in the code or security vulnerabilities, and proposing and fast-tracking emergency referenda.

Ecosystem

Whilst parachains aren't implemented yet, there is a rapidly growing ecosystem looking to build on Polkadot with Substrate. Polkadot's "cousin", the canary network Kusama used for experimentation, was launched last year by the same team and contributes to the early growth of the overall ecosystem. See here for a list of the current projects looking to build on Polkadot, and filter by Substrate-based.
https://preview.redd.it/rt8i0hqpe6m51.png?width=700&format=png&auto=webp&s=f6bcf26fa84463765f720c3074ee10157c2735f6
Now that we have covered the basics, in part two I will explain how the consensus mechanism in Polkadot works and cover more of the technical aspects.
submitted by xSeq22x to CryptoCurrency [link] [comments]

Living in a post-GPT-3 world: What will the consequences be?

As you have probably noticed, the recently published GPT-3 language model from OpenAI has captured the attention of this and some other communities and deservedly so: Another previously exclusive domain of human intellect, namely speech generation, has seemingly fallen. Sure, the text isn't always coherent and the semantics can be lacking, but consider that we have been working on this problem since the 60-70s; take ELIZA as an example. A lot of the work that later ended up in stuff like compilers was also about natural language generation and understanding. But, using the techniques from that time, language generation turned out to be more or less impossible; ELIZA was just a party trick. But now, AI has seemingly conquered this domain as well.
Some people think that with some tweaks, such models can achieve human-like intelligence; general AI (GAI) as a sequence-generation problem. I personally am not quite that optimistic (or pessimistic?), but let's consider the direct impact of GPT-like models on our society.
submitted by IdiocyInAction to slatestarcodex [link] [comments]

vectorbt - blazingly fast backtesting and interactive data analysis for quants

I want to share with you a tool that I was continuously developing during the last couple of months.
https://github.com/polakowo/vectorbt

As a data scientist, when I first started flirting with quant trading, I quickly realized that there is a shortage of Python packages that can actually enable me to iterate over a long list of possible strategies and hyper-parameters quickly. Most open-source backtesting libraries are very mature in terms of functionality, but simply lack speed. Questions like "Which strategy is better: X or Y?" require fast computation and transformation of data. This not only prolongs your strategy-design cycle, it is also dangerous: a limited number of tests is like tunnel vision - it prevents you from seeing the bigger picture and makes you dive into the market blindly.
After trying to tweak pandas, use multiprocessing, and even evaluate my strategies on a cluster with Spark, I finally found myself using Numba - a Python library that can compile slow Python code to run at native machine-code speed. And since there were no packages in the Python ecosystem that could even closely match the speed of my own backtests, I made vectorbt.
vectorbt combines pandas, NumPy and Numba sauce to obtain an orders-of-magnitude speedup over other libraries. It builds upon the idea that each instance of a trading strategy can be represented in a vectorized form, so multiple strategy instances can be packed into a single multi-dimensional array. In this form, they can be processed in a highly efficient manner and compared easily. It also integrates Plotly and ipywidgets to display complex charts and dashboards akin to Tableau right in the Jupyter notebook. You can find basic examples and explanations in the documentation.
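To make the "many strategy instances packed into one array" idea concrete before the full vectorbt example below, here is a minimal NumPy/pandas-only sketch; it does not use vectorbt's API, and the synthetic prices and moving-average strategies are invented purely for illustration:

import numpy as np
import pandas as pd

# Synthetic price series (a random walk) standing in for real market data.
rng = np.random.default_rng(42)
price = pd.Series(100 * np.exp(np.cumsum(rng.normal(0, 0.01, 500))))

# One strategy instance per moving-average window, packed column-wise.
windows = [10, 20, 50, 100]
mas = pd.concat([price.rolling(w).mean() for w in windows], axis=1).to_numpy()

# Boolean (time x strategies) matrix: in the market when price is above its MA.
in_market = price.to_numpy()[:, None] > mas

# Use yesterday's signal to avoid look-ahead, then compute all returns at once.
signal = np.roll(in_market, 1, axis=0)
signal[0] = False
daily_ret = price.pct_change().to_numpy()[:, None]
strat_ret = np.where(signal, daily_ret, 0.0)

total_return = np.prod(1 + strat_ret, axis=0) - 1
print(dict(zip(windows, np.round(total_return, 3))))  # one result per strategy instance

Every strategy variant lives in one column of the same array, so adding another hundred parameter combinations only widens the matrix rather than adding another Python loop.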

Below is an example that runs a total of 67,032 tests on three different timeframes of Bitcoin price history to explore how the performance of a MACD strategy depends upon various combinations of fast, slow and signal windows:
import vectorbt as vbt
import numpy as np
import yfinance as yf
from itertools import combinations, product

# Fetch daily price of Bitcoin
price = yf.Ticker("BTC-USD").history(period="max")['Close']
price = price.vbt.split_into_ranges(n=3)

# Define hyper-parameter space
# 49 fast x 49 slow x 19 signal
fast_windows, slow_windows, signal_windows = vbt.indicators.create_param_combs(
    (product, (combinations, np.arange(2, 51, 1), 2), np.arange(2, 21, 1)))

# Run MACD indicator
macd_ind = vbt.MACD.from_params(
    price,
    fast_window=fast_windows,
    slow_window=slow_windows,
    signal_window=signal_windows,
    hide_params=['macd_ewm', 'signal_ewm']
)

# Long when MACD is above zero AND signal
entries = macd_ind.macd_above(0) & macd_ind.macd_above(macd_ind.signal)

# Short when MACD is below zero OR signal
exits = macd_ind.macd_below(0) | macd_ind.macd_below(macd_ind.signal)

# Build portfolio
portfolio = vbt.Portfolio.from_signals(
    price.vbt.tile(len(fast_windows)), entries, exits, fees=0.001, freq='1D')

# Draw all window combinations as a 3D volume
fig = portfolio.total_return.vbt.volume(
    x_level='macd_fast_window',
    y_level='macd_slow_window',
    z_level='macd_signal_window',
    slider_level='range_start',
    template='plotly_dark',
    trace_kwargs=dict(
        colorscale='Viridis',
        colorbar=dict(
            title='Total return',
            tickformat='%'
        )
    )
)
fig.show()

https://reddit.com/link/hxl6bn/video/180sxqa8mzc51/player
From signal generation to data visualization, the example above needs roughly a minute to run.

vectorbt lets you:
The current implementation has limitations though:

If it sounds cool enough, try it out! I would love it if you gave me some feedback and contributed to it at some point, as the codebase has grown very fast. Cheers.
submitted by plkwo to algotrading [link] [comments]

Scaling Reddit Community Points with Arbitrum Rollup: a piece of cake

Scaling Reddit Community Points with Arbitrum Rollup: a piece of cake
https://preview.redd.it/b80c05tnb9e51.jpg?width=2550&format=pjpg&auto=webp&s=850282c1a3962466ed44f73886dae1c8872d0f31
Submitted for consideration to The Great Reddit Scaling Bake-Off
Baked by the pastry chefs at Offchain Labs
Please send questions or comments to [email protected]
1. Overview
We're excited to submit Arbitrum Rollup for consideration to The Great Reddit Scaling Bake-Off. Arbitrum Rollup is the only Ethereum scaling solution that supports arbitrary smart contracts without compromising on Ethereum's security or adding points of centralization. For Reddit, this means that Arbitrum can not only scale the minting and transfer of Community Points, but it can foster a creative ecosystem built around Reddit Community Points enabling points to be used in a wide variety of third party applications. That's right -- you can have your cake and eat it too!
Arbitrum Rollup isn't just Ethereum-style. Its Layer 2 transactions are byte-for-byte identical to Ethereum, which means Ethereum users can continue to use their existing addresses and wallets, and Ethereum developers can continue to use their favorite toolchains and development environments out-of-the-box with Arbitrum. Coupling Arbitrum’s tooling-compatibility with its trustless asset interoperability, Reddit not only can scale but can onboard the entire Ethereum community at no cost by giving them the same experience they already know and love (well, certainly know).
To benchmark how Arbitrum can scale Reddit Community Points, we launched the Reddit contracts on an Arbitrum Rollup chain. Since Arbitrum provides full Solidity support, we didn't have to rewrite the Reddit contracts or try to mimic their functionality using an unfamiliar paradigm. Nope, none of that. We launched the Reddit contracts unmodified on Arbitrum Rollup complete with support for minting and distributing points. Like every Arbitrum Rollup chain, the chain included a bridge interface in which users can transfer Community Points or any other asset between the L1 and L2 chains. Arbitrum Rollup chains also support dynamic contract loading, which would allow third-party developers to launch custom ecosystem apps that integrate with Community Points on the very same chain that runs the Reddit contracts.
1.1 Why Ethereum
Perhaps the most exciting benefit of distributing Community Points using a blockchain is the ability to seamlessly port points to other applications and use them in a wide variety of contexts. Applications may include simple transfers such as a restaurant that allows Redditors to spend points on drinks. Or it may include complex smart contracts -- such as placing Community Points as a wager for a multiparty game or as collateral in a financial contract.
The common denominator between all of the fun uses of Reddit points is that it needs a thriving ecosystem of both users and developers, and the Ethereum blockchain is perhaps the only smart contract platform with significant adoption today. While many Layer 1 blockchains boast lower cost or higher throughput than the Ethereum blockchain, more often than not, these attributes mask the reality of little usage, weaker security, or both.
Perhaps another platform with significant usage will rise in the future. But today, Ethereum captures the mindshare of the blockchain community, and for Community Points to provide the most utility, the Ethereum blockchain is the natural choice.
1.2 Why Arbitrum
While Ethereum's ecosystem is unmatched, the reality is that fees are high and capacity is too low to support the scale of Reddit Community Points. Enter Arbitrum. Arbitrum Rollup provides all of the ecosystem benefits of Ethereum, but with orders of magnitude more capacity and at a fraction of the cost of native Ethereum smart contracts. And most of all, we don't change the experience from users. They continue to use the same wallets, addresses, languages, and tools.
Arbitrum Rollup is not the only solution that can scale payments, but it is the only developed solution that can scale both payments and arbitrary smart contracts trustlessly, which means that third party users can build highly scalable add-on apps that can be used without withdrawing money from the Rollup chain. If you believe that Reddit users will want to use their Community Points in smart contracts--and we believe they will--then it makes the most sense to choose a single scaling solution that can support the entire ecosystem, eliminating friction for users.
We view being able to run smart contracts in the same scaling solution as fundamentally critical: if there's significant demand for running smart contracts in Reddit's ecosystem, that demand would itself be a load on Ethereum and would itself require a scaling solution. Moreover, having different scaling solutions for the minting/distribution/spending of points and for third-party apps would be burdensome for users, as they'd have to constantly shuffle their Points back and forth.
2. Arbitrum at a glance
Arbitrum Rollup has a unique value proposition as it offers a combination of features that no other scaling solution achieves. Here we highlight its core attributes.
Decentralized. Arbitrum Rollup is as decentralized as Ethereum. Unlike some other Layer 2 scaling projects, Arbitrum Rollup doesn't have any centralized components or centralized operators who can censor users or delay transactions. Even in non-custodial systems, centralized components provide a risk as the operators are generally incentivized to increase their profit by extracting rent from users often in ways that severely degrade user experience. Even if centralized operators are altruistic, centralized components are subject to hacking, coercion, and potential liability.
Massive Scaling. Arbitrum achieves order of magnitude scaling over Ethereum's L1 smart contracts. Our software currently supports 453 transactions-per-second for basic transactions (at 1616 Ethereum gas per tx). We have a lot of room left to optimize (e.g. aggregating signatures), and over the next several months capacity will increase significantly. As described in detail below, Arbitrum can easily support and surpass Reddit's anticipated initial load, and its capacity will continue to improve as Reddit's capacity needs grow.
Low cost. The cost of running Arbitrum Rollup is quite low compared to L1 Ethereum and other scaling solutions such as those based on zero-knowledge proofs. Layer 2 fees are low, fixed, and predictable and should not be overly burdensome for Reddit to cover. Nobody needs to use special equipment or high-end machines. Arbitrum requires validators, which is a permissionless role that can be run on any reasonable on-line machine. Although anybody can act as a validator, in order to protect against a “tragedy of the commons” and make sure reputable validators are participating, we support a notion of “invited validators” that are compensated for their costs. In general, users pay (low) fees to cover the invited validators’ costs, but we imagine that Reddit may cover this cost for its users. See more on the costs and validator options below.
Ethereum Developer Experience. Not only does Arbitrum support EVM smart contracts, but the developer experience is identical to that of L1 Ethereum contracts and fully compatible with Ethereum tooling. Developers can port existing Solidity apps or write new ones using their favorite and familiar toolchains (e.g. Truffle, Buidler). There are no new languages or coding paradigms to learn.
Ethereum wallet compatibility. Just as in Ethereum, Arbitrum users need only hold keys, but do not have to store any coin history or additional data to protect or access their funds. Since Arbitrum transactions are semantically identical to Ethereum L1 transactions, existing Ethereum users can use their existing Ethereum keys with their existing wallet software such as Metamask.
Token interoperability. Users can easily transfer their ETH, ERC-20 and ERC-721 tokens between Ethereum and the Arbitrum Rollup chain. As we explain in detail below, it is possible to mint tokens in L2 that can subsequently be withdrawn and recognized by the L1 token contract.
Fast finality. Transactions complete with the same finality time as Ethereum L1 (and it's possible to get faster finality guarantees by trading away trust assumptions; see the Arbitrum Rollup whitepaper for details).
Non-custodial. Arbitrum Rollup is a non-custodial scaling solution, so users control their funds/points and neither Reddit nor anyone else can ever access or revoke points held by users.
Censorship Resistant. Since it's completely decentralized, and the Arbitrum protocol guarantees progress trustlessly, Arbitrum Rollup is just as censorship-proof as Ethereum.
Block explorer. The Arbitrum Rollup block explorer allows users to view and analyze transactions on the Rollup chain.
Limitations
Although this is a bake-off, we're not going to sugar coat anything. Arbitrum Rollup, like any Optimistic Rollup protocol, does have one limitation, and that's the delay on withdrawals.
As for the concrete length of the delay, we've done a good deal of internal modeling and have blogged about this as well. Our current modeling suggests a 3-hour delay is sufficient (but as discussed in the linked post there is a tradeoff space between the length of the challenge period and the size of the validators’ deposit).
Note that this doesn't mean that the chain is delayed for three hours. Arbitrum Rollup supports pipelining of execution, which means that validators can keep building new states even while previous ones are “in the pipeline” for confirmation. As the challenge delays expire for each update, a new state will be confirmed (read more about this here).
So activity and progress on the chain are not delayed by the challenge period. The only thing that's delayed is the consummation of withdrawals. Recall though that any single honest validator knows immediately (at the speed of L1 finality) which state updates are correct and can guarantee that they will eventually be confirmed, so once a valid withdrawal has been requested on-chain, every honest party knows that the withdrawal will definitely happen. There's a natural place here for a liquidity market in which a validator (or someone who trusts a validator) can provide withdrawal loans for a small interest fee. This is a no-risk business for them as they know which withdrawals will be confirmed (and can force their confirmation trustlessly no matter what anyone else does) but are just waiting for on-chain finality.
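As a rough illustration of why such a liquidity market can charge only a small fee, here is a toy calculation; the 2% annual rate is an assumption for the sketch, not a figure quoted anywhere in this proposal:

# Toy withdrawal-advance pricing: a validator who already knows the withdrawal
# will confirm can pay the user now and keep a small time-value-of-money fee.

def advance_payout(amount, annual_rate=0.02, delay_hours=3):
    """Return (amount paid out now, fee kept by the liquidity provider)."""
    fee = amount * annual_rate * delay_hours / (365 * 24)
    return amount - fee, fee

print(advance_payout(1_000.0))
# (~999.993, ~0.007): bridging a 3-hour challenge delay costs almost nothing.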
3. The recipe: How Arbitrum Rollup works
For a description of the technical components of Arbitrum Rollup and how they interact to create a highly scalable protocol with a developer experience that is identical to Ethereum, please refer to the following documents:
Arbitrum Rollup Whitepaper
Arbitrum academic paper (describes a previous version of Arbitrum)
4. Developer docs and APIs
For full details about how to set up and interact with an Arbitrum Rollup chain or validator, please refer to our developer docs, which can be found at https://developer.offchainlabs.com/.
Note that the Arbitrum version described on that site is older and will soon be replaced by the version we are entering in Reddit Bake-Off, which is still undergoing internal testing before public release.
5. Who are the validators?
As with any Layer 2 protocol, advancing the protocol correctly requires at least one validator (sometimes called block producers) that is honest and available. A natural question is: who are the validators?
Recall that the validator set for an Arbitrum chain is open and permissionless; anyone can start or stop validating at will. (A useful analogy is to full nodes on an L1 chain.) But we understand that even though anyone can participate, Reddit may want to guarantee that highly reputable nodes are validating their chain. Reddit may choose to validate the chain themselves and/or hire third-party validators. To this end, we have begun building a marketplace for validator-for-hire services so that dapp developers can outsource validation services to reputable nodes with high up-time. We've announced a partnership in which Chainlink nodes will provide Arbitrum validation services, and we expect to announce more partnerships shortly with other blockchain infrastructure providers.
Although there is no requirement that validators are paid, Arbitrum’s economic model tracks validators’ costs (e.g. amount of computation and storage) and can charge small fees on user transactions, using a gas-type system, to cover those costs. Alternatively, a single party such as Reddit can agree to cover the costs of invited validators.
6. Reddit Contract Support
Since Arbitrum contracts and transactions are byte-for-byte compatible with Ethereum, supporting the Reddit contracts is as simple as launching them on an Arbitrum chain.
Minting. Arbitrum Rollup supports hybrid L1/L2 tokens which can be minted in L2 and then withdrawn onto the L1. An L1 contract at address A can make a special call to the EthBridge which deploys a "buddy contract" to the same address A on an Arbitrum chain. Since it's deployed at the same address, users can know that the L2 contract is the authorized "buddy" of the L1 contract on the Arbitrum chain.
For minting, the L1 contract is a standard ERC-20 contract which mints and burns tokens when requested by the L2 contract. It is paired with an ERC-20 contract in L2 which mints tokens based on whatever programmer provided minting facility is desired and burns tokens when they are withdrawn from the rollup chain. Given this base infrastructure, Arbitrum can support any smart contract based method for minting tokens in L2, and indeed we directly support Reddit's signature/claim based minting in L2.
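As a conceptual illustration only (plain Python, not Solidity, and not Arbitrum's actual EthBridge or contract code), the mint-on-L2 / withdraw-to-L1 flow described above can be modelled like this, with every class and method name invented:

class L1PointsToken:
    """Stands in for the standard ERC-20 on L1 that mints when tokens leave L2."""
    def __init__(self):
        self.balances = {}

    def mint_from_l2(self, user, amount):
        self.balances[user] = self.balances.get(user, 0) + amount


class L2PointsToken:
    """Stands in for the paired L2 contract with a programmer-defined minting rule."""
    def __init__(self, l1_buddy):
        self.balances = {}
        self.l1 = l1_buddy

    def claim(self, user, amount):            # e.g. a signature/claim style mint on L2
        self.balances[user] = self.balances.get(user, 0) + amount

    def withdraw(self, user, amount):
        assert self.balances.get(user, 0) >= amount
        self.balances[user] -= amount          # burn on L2
        self.l1.mint_from_l2(user, amount)     # recognised by the L1 "buddy" contract


l1 = L1PointsToken()
l2 = L2PointsToken(l1)
l2.claim("alice", 100)
l2.withdraw("alice", 40)
print(l2.balances, l1.balances)   # {'alice': 60} {'alice': 40}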
Batch minting. What's better than a mint cookie? A whole batch! In addition to supporting Reddit’s current minting/claiming scheme, we built a second minting design, which we believe outperforms the signature/claim system in many scenarios.
In the current system, Reddit periodically issues signed statements to users, who then take those statements to the blockchain to claim their tokens. An alternative approach would have Reddit directly submit the list of users/amounts to the blockchain and distribute the tokens to the users without the signature/claim process.
To optimize the cost efficiency of this approach, we designed an application-specific compression scheme to minimize the size of the batch distribution list. We analyzed the data from Reddit's previous distributions and found that the data is highly compressible since token amounts are small and repeated, and addresses appear multiple times. Our function groups transactions by size, and replaces previously-seen addresses with a shorter index value. We wrote client code to compress the data, wrote a Solidity decompressing function, and integrated that function into Reddit’s contract running on Arbitrum.
When we ran the compression function on the previous Reddit distribution data, we found that we could compress batched minting data down to 11.8 bytes per minting event (averaged over a 6-month trace of Reddit's historical token grants), compared with roughly 174 bytes of on-chain data needed for the signature-claim approach to minting (roughly 43 for an RLP-encoded null transaction + 65 for Reddit's signature + 65 for the user's signature + roughly 8 for the number of Points).
The relative benefit of the two approaches with respect to on-chain call data cost depends on the percentage of users that will actually claim their tokens on chain. With the above figures, batch minting will be cheaper if roughly 5% of users redeem their claims. We stress that our compression scheme is not Arbitrum-specific and would be beneficial in any general-purpose smart contract platform.
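To illustrate the flavour of such a scheme, here is a hypothetical sketch of the two ideas named above (grouping by amount, and replacing previously seen addresses with short indices); it is not Offchain Labs' actual encoder, and the addresses and amounts are invented:

def compress_batch(grants, known_addresses):
    """grants: list of (address, amount) pairs to distribute.
    known_addresses: addresses seen in earlier distributions, in a canonical
    order both the compressor and the on-chain decompressor agree on.
    Returns amounts mapped to lists of short indices or full addresses."""
    index_of = {addr: i for i, addr in enumerate(known_addresses)}
    by_amount = {}
    for addr, amount in grants:
        token = index_of.get(addr, addr)       # small index if seen before, else full address
        by_amount.setdefault(amount, []).append(token)
    return by_amount

grants = [("0xaaaa...", 250), ("0xbbbb...", 250), ("0xcccc...", 1000), ("0xaaaa...", 1000)]
print(compress_batch(grants, known_addresses=["0xaaaa...", "0xbbbb..."]))
# {250: [0, 1], 1000: ['0xcccc...', 0]} -- repeated amounts stored once, repeat
# addresses shrunk to indices, which is where the byte savings come from.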
8. Benchmarks and costs
In this section, we give the full costs of operating the Reddit contracts on an Arbitrum Rollup chain including the L1 gas costs for the Rollup chain, the costs of computation and storage for the L2 validators as well as the capital lockup requirements for staking.
Arbitrum Rollup is still on testnet, so we did not run mainnet benchmarks. Instead, we measured the L1 gas cost and L2 workload for Reddit operations on Arbitrum and calculated the total cost assuming current Ethereum gas prices. As noted below in detail, our measurements do not assume that Arbitrum is consuming the entire capacity of Ethereum. We will present the details of our model now, but for full transparency you can also play around with it yourself and adjust the parameters, by copying the spreadsheet found here.
Our cost model is based on measurements of Reddit’s contracts, running unmodified (except for the addition of a batch minting function) on Arbitrum Rollup on top of Ethereum.
On the distribution of transactions and frequency of assertions. Reddit's instructions specify the following minimum parameters that submissions should support:
Over a 5 day period, your scaling PoC should be able to handle:
  • 100,000 point claims (minting & distributing points)
  • 25,000 subscriptions
  • 75,000 one-off points burning
  • 100,000 transfers
We provide the full costs of operating an Arbitrum Rollup chain with this usage under the assumption that tokens are minted or granted to users in batches, but other transactions are uniformly distributed over the 5 day period. Unlike some other submissions, we do not make unrealistic assumptions that all operations can be submitted in enormous batches. We assume that batch minting is done in batches that use only a few percent on an L1 block’s gas, and that other operations come in evenly over time and are submitted in batches, with one batch every five minutes to keep latency reasonable. (Users are probably already waiting for L1 finality, which takes at least that long to achieve.)
We note that assuming that there are only 300,000 transactions that arrive uniformly over the 5 day period will make our benchmark numbers lower, but we believe that this will reflect the true cost of running the system. To see why, say that batches are submitted every five minutes (20 L1 blocks) and there's a fixed overhead of c bytes of calldata per batch, the cost of which will get amortized over all transactions executed in that batch. Assume that each individual transaction adds a marginal cost of t. Lastly, assume the capacity of the scaling system is high enough that it can support all of Reddit's 300,000 transactions within a single 20-block batch (i.e. that there is more than c + 300,000*t bytes of calldata available in 20 blocks).
Consider what happens if c, the per-batch overhead, is large (which it is in some systems, but not in Arbitrum). In the scenario that transactions actually arrive at the system's capacity and each batch is full, then c gets amortized over 300,000 transactions. But if we assume that the system is not running at capacity--and only receives 300,000 transactions arriving uniformly over 5 days-- then each 20-block assertion will contain about 200 transactions, and thus each transaction will pay a nontrivial cost due to c.
We are aware that other proposals presented scaling numbers assuming that 300,000 transactions arrived at maximum capacity and were executed in a single mega-transaction, but according to our estimates, for at least one such report, this led to a reported gas price that was 2-3 orders of magnitude lower than it would have been assuming uniform arrival. We make more realistic batching assumptions, and we believe Arbitrum compares well when batch sizes are realistic.
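A toy calculation shows why the arrival assumption matters so much; the per-batch overhead c and marginal size t below are invented to illustrate a system where c is large (which, as noted above, is not the case for Arbitrum):

# Per-transaction calldata cost = marginal bytes + (batch overhead / txs per batch).

def cost_per_tx(c_overhead_bytes, t_marginal_bytes, txs_per_batch):
    return t_marginal_bytes + c_overhead_bytes / txs_per_batch

c, t = 50_000, 12                      # made-up large overhead and marginal bytes per tx
full_batch = 300_000                   # all of Reddit's txs crammed into one mega-batch
uniform = 300_000 // (5 * 24 * 12)     # ~208 txs per 5-minute batch over 5 days

print(cost_per_tx(c, t, full_batch))   # ~12.2 bytes/tx: overhead nearly vanishes
print(cost_per_tx(c, t, uniform))      # ~252 bytes/tx: overhead dominates under uniform arrival

Under the unrealistic single-mega-batch assumption, a large fixed overhead is almost invisible; under uniform arrival it dominates, which is exactly the gap between the two kinds of benchmark numbers.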
Our model. Our cost model includes several sources of cost:
  • L1 gas costs: This is the cost of posting transactions as calldata on the L1 chain, as well as the overhead associated with each batch of transactions, and the L1 cost of settling transactions in the Arbitrum protocol.
  • Validator’s staking costs: In normal operation, one validator will need to be staked. The stake is assumed to be 0.2% of the total value of the chain (which is assumed to be $1 per user who is eligible to claim points). The cost of staking is the interest that could be earned on the money if it were not staked.
  • Validator computation and storage: Every validator must do computation to track the chain’s processing of transactions, and must maintain storage to keep track of the contracts’ EVM storage. The cost of computation and storage are estimated based on measurements, with the dollar cost of resources based on Amazon Web Services pricing.
It’s clear from our modeling that the predominant cost is for L1 calldata. This will probably be true for any plausible rollup-based system.
Our model also shows that Arbitrum can scale to workloads much larger than Reddit’s nominal workload, without exhausting L1 or L2 resources. The scaling bottleneck will ultimately be calldata on the L1 chain. We believe that cost could be reduced substantially if necessary by clever encoding of data. (In our design any compression / decompression of L2 transaction calldata would be done by client software and L2 programs, never by an L1 contract.)
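For intuition only, here is a toy version of that cost structure in Python; every number below is a placeholder assumption (gas price, ETH price, interest rate, bytes per transaction, compute cost), and the real measurements live in the linked spreadsheet:

GAS_PER_CALLDATA_BYTE = 16          # Ethereum's cost for a non-zero calldata byte
gas_price_gwei = 50                 # assumed L1 gas price
eth_price_usd = 400                 # assumed ETH price
interest_rate = 0.05                # assumed opportunity cost of locked stake, per year

def l1_calldata_cost_usd(total_bytes):
    gas = total_bytes * GAS_PER_CALLDATA_BYTE
    return gas * gas_price_gwei * 1e-9 * eth_price_usd

def staking_cost_usd(eligible_users, days):
    chain_value = eligible_users * 1.0      # $1 per user eligible to claim points, as above
    stake = 0.002 * chain_value             # stake assumed to be 0.2% of chain value
    return stake * interest_rate * days / 365

def validator_compute_cost_usd(days, usd_per_day=5.0):
    return usd_per_day * days               # placeholder AWS-style compute/storage figure

total = (l1_calldata_cost_usd(300_000 * 12)   # assume ~12 calldata bytes per tx, 300k txs
         + staking_cost_usd(100_000, days=5)
         + validator_compute_cost_usd(days=5))
print(f"~${total:,.2f} for the 5-day benchmark under these placeholder inputs")
# Even in this crude toy, the L1 calldata term dwarfs the staking and compute terms,
# matching the observation above that calldata is the dominant cost.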
9. Status of Arbitrum Rollup
Arbitrum Rollup is live on Ethereum testnet. All of the code written to date including everything included in the Reddit demo is open source and permissively licensed under the Apache V2 license. The first testnet version of Arbitrum Rollup was released on testnet in February. Our current internal version, which we used to benchmark the Reddit contracts, will be released soon and will be a major upgrade.
Both the Arbitrum design as well as the implementation are heavily audited by independent third parties. The Arbitrum academic paper was published at USENIX Security, a top-tier peer-reviewed academic venue. For the Arbitrum software, we have engaged Trail of Bits for a security audit, which is currently ongoing, and we are committed to have a clean report before launching on Ethereum mainnet.
10. Reddit Universe Arbitrum Rollup Chain
The benchmarks described in this document were all measured using the latest internal build of our software. When we release the new software upgrade publicly we will launch a Reddit Universe Arbitrum Rollup chain as a public demo, which will contain the Reddit contracts as well as a Uniswap instance and a Connext Hub, demonstrating how Community Points can be integrated into third party apps. We will also allow members of the public to dynamically launch ecosystem contracts. We at Offchain Labs will cover the validating costs for the Reddit Universe public demo.
If the folks at Reddit would like to evaluate our software prior to our public demo, please email us at [email protected] and we'd be more than happy to provide early access.
11. Even more scaling: Arbitrum Sidechains
Rollups are an excellent approach to scaling, and we are excited about Arbitrum Rollup which far surpasses Reddit's scaling needs. But looking forward to Reddit's eventual goal of supporting hundreds of millions of users, there will likely come a time when Reddit needs more scaling than any Rollup protocol can provide.
While Rollups greatly reduce costs, they don't break the linear barrier. That is, all transactions have an on-chain footprint (because all calldata must be posted on-chain), albeit a far smaller one than on native Ethereum, and the L1 limitations end up being the bottleneck for capacity and cost. Since Ethereum has limited capacity, this linear use of on-chain resources means that costs will eventually increase superlinearly with traffic.
The good news is that we at Offchain Labs have a solution in our roadmap that can satisfy this extreme-scaling setting as well: Arbitrum AnyTrust Sidechains. Arbitrum Sidechains are similar to Arbitrum Rollup, but deviate in that they name a permissioned set of validators. When a chain’s validators agree off-chain, they can greatly reduce the on-chain footprint of the protocol and require almost no data to be put on-chain. When validators can't reach unanimous agreement off-chain, the protocol reverts to Arbitrum Rollup. Technically, Arbitrum Sidechains can be viewed as a hybrid between state channels and Rollup, switching back and forth as necessary, and combining the performance and cost that state channels can achieve in the optimistic case, with the robustness of Rollup in other cases. The core technical challenge is how to switch seamlessly between modes and how to guarantee that security is maintained throughout.
Arbitrum Sidechains break through this linear barrier, while still maintaining a high level of security and decentralization. Arbitrum Sidechains provide the AnyTrust guarantee, which says that as long as any one validator is honest and available (even if you don't know which one will be), the L2 chain is guaranteed to execute correctly according to its code and guaranteed to make progress. Unlike in a state channel, offchain progress does not require unanimous consent, and liveness is preserved as long as there is a single honest validator.
Note that the trust model for Arbitrum Sidechains is much stronger than for typical BFT-style chains which introduce a consensus "voting" protocols among a small permissioned group of validators. BFT-based protocols require a supermajority (more than 2/3) of validators to agree. In Arbitrum Sidechains, by contrast, all you need is a single honest validator to achieve guaranteed correctness and progress. Notice that in Arbitrum adding validators strictly increases security since the AnyTrust guarantee provides correctness as long as any one validator is honest and available. By contrast, in BFT-style protocols, adding nodes can be dangerous as a coalition of dishonest nodes can break the protocol.
Like Arbitrum Rollup, the developer and user experiences for Arbitrum Sidechains will be identical to that of Ethereum. Reddit would be able to choose a large and diverse set of validators, and all that they would need to guarantee to break through the scaling barrier is that a single one of them will remain honest.
We hope to have Arbitrum Sidechains in production in early 2021, and thus when Reddit reaches the scale that surpasses the capacity of Rollups, Arbitrum Sidechains will be waiting and ready to help.
While the idea to switch between channels and Rollup to get the best of both worlds is conceptually simple, getting the details right and making sure that the switch does not introduce any attack vectors is highly non-trivial and has been the subject of years of our research (indeed, we were working on this design for years before the term Rollup was even coined).
12. How Arbitrum compares
We include a comparison to several other categories, as well as specific projects when appropriate, and explain why we believe that Arbitrum is best suited for Reddit's purposes. We focus our attention on other Ethereum projects.
Payment only Rollups. Compared to Arbitrum Rollup, ZK-Rollups and other Rollups that only support token transfers have several disadvantages:
  • As outlined throughout the proposal, we believe that the entire draw of Ethereum is in its rich smart contracts support which is simply not achievable with today's zero-knowledge proof technology. Indeed, scaling with a ZK-Rollup will add friction to the deployment of smart contracts that interact with Community Points as users will have to withdraw their coins from the ZK-Rollup and transfer them to a smart contract system (like Arbitrum). The community will be best served if Reddit builds on a platform that has built-in, frictionless smart-contract support.
  • All other Rollup protocols of which we are aware employ a centralized operator. While it's true that users retain custody of their coins, the centralized operator can often profit from censoring, reordering, or delaying transactions. A common misconception is that since they're non-custodial protocols, a centralized sequencer does not pose a risk but this is incorrect as the sequencer can wreak havoc or shake down users for side payments without directly stealing funds.
  • Sidechain type protocols can eliminate some of these issues, but they are not trustless. Instead, they require trust in some quorum of a committee, often requiring two-third of the committee to be honest, compared to rollup protocols like Arbitrum that require only a single honest party. In addition, not all sidechain type protocols have committees that are diverse, or even non-centralized, in practice.
  • Plasma-style protocols have a centralized operator and do not support general smart contracts.
13. Concluding Remarks
While it's ultimately up to the judges’ palate, we believe that Arbitrum Rollup is the bakeoff choice that Reddit kneads. We far surpass Reddit's specified workload requirement at present, have much room to optimize Arbitrum Rollup in the near term, and have a clear path to get Reddit to hundreds of millions of users. Furthermore, we are the only project that gives developers and users the identical interface as the Ethereum blockchain and is fully interoperable and tooling-compatible, and we do this all without any new trust assumptions or centralized components.
But no matter how the cookie crumbles, we're glad to have participated in this bake-off and we thank you for your consideration.
About Offchain Labs
Offchain Labs, Inc. is a venture-funded New York company that spun out of Princeton University research, and is building the Arbitrum platform to usher in the next generation of scalable, interoperable, and compatible smart contracts. Offchain Labs is backed by Pantera Capital, Compound VC, Coinbase Ventures, and others.
Leadership Team
Ed Felten
Ed Felten is Co-founder and Chief Scientist at Offchain Labs. He is on leave from Princeton University, where he is the Robert E. Kahn Professor of Computer Science and Public Affairs. From 2015 to 2017 he served at the White House as Deputy United States Chief Technology Officer and senior advisor to the President. He is an ACM Fellow and member of the National Academy of Engineering. Outside of work, he is an avid runner, cook, and L.A. Dodgers fan.
Steven Goldfeder
Steven Goldfeder is Co-founder and Chief Executive Officer at Offchain Labs. He holds a PhD from Princeton University, where he worked at the intersection of cryptography and cryptocurrencies including threshold cryptography, zero-knowledge proof systems, and post-quantum signatures. He is a co-author of Bitcoin and Cryptocurrency Technologies, the leading textbook on cryptocurrencies, and he has previously worked at Google and Microsoft Research, where he co-invented the Picnic signature algorithm. When not working, you can find Steven spending time with his family, taking a nature walk, or twisting balloons.
Harry Kalodner
Harry Kalodner is Co-founder and Chief Technology Officer at Offchain Labs where he leads the engineering team. Before the company he attended Princeton as a Ph.D candidate where his research explored economics, anonymity, and incentive compatibility of cryptocurrencies, and he also has worked at Apple. When not up at 3:00am writing code, Harry occasionally sleeps.
submitted by hkalodner to ethereum [link] [comments]

Delightful Privacy

Delightful Privacy

This is a collection of software, operating systems, and other miscellaneous tools to help the average user fight for their privacy and security online.

Operating Systems

Fedora

Fedora uses Security-Enhanced Linux by default, which implements a variety of security policies, including mandatory access controls, which Fedora adopted early on. Fedora provides a hardening wrapper, and does hardening for all of its packages by using compiler features such as position-independent executable (PIE). Wikipedia

Pop!_OS

Pop!_OS provides full out-of-the-box support for both AMD and Nvidia GPUs. It is regarded as an easy distribution to set-up for gaming, mainly due to its built-in GPU support. Pop!_OS provides default disk encryption, streamlined window and workspace management, keyboard shortcuts for navigation as well as built in power management profiles. The latest releases also have packages that allow for easy setup for TensorFlow and CUDA. Wikipedia

Debian

Debian is one of the oldest operating systems based on the Linux kernel. The project is coordinated over the Internet by a team of volunteers guided by the Debian Project Leader and three foundational documents: the Debian Social Contract, the Debian Constitution, and the Debian Free Software Guidelines. New distributions are updated continually, and the next candidate is released after a time-based freeze. Wikipedia

openSUSE Tumbleweed - Rolling Release!

Any user who wishes to have the newest packages that include, but are not limited to, the Linux Kernel, SAMBA, git, desktops, office applications and many other packages, will want Tumbleweed. openSUSE

For enhanced security

Qubes OS

Qubes OS is a security-focused desktop operating system that aims to provide security through isolation. Virtualization is performed by Xen, and user environments can be based on Fedora, Debian, Whonix, and Microsoft Windows, among other operating systems. Wikipedia

Tails

Tails, or The Amnesic Incognito Live System, is a security-focused Debian-based Linux distribution aimed at preserving privacy and anonymity. All its incoming and outgoing connections are forced to go through Tor, and any non-anonymous connections are blocked. Wikipedia

Whonix

Whonix is a Debian GNU/Linux–based security-focused Linux distribution. It aims to provide privacy, security and anonymity on the internet. The operating system consists of two virtual machines, a "Workstation" and a Tor "Gateway", running Debian GNU/Linux. All communications are forced through the Tor network to accomplish this. Wikipedia

Web Browsers

For Desktop

Firefox Needs manual tweaking to be more secure! Use ghacks

Firefox is a free and open-source web browser developed by the Mozilla Foundation and its subsidiary, the Mozilla Corporation. Wikipedia Recommended addons: uBlock Origin | Https Everywhere | Privacy Badger | Privacy Possum | Decentraleyes | NoScript | CanvasBlocker

Tor

Tor is free and open-source software for enabling anonymous communication. The name is derived from the acronym for the original software project name "The Onion Router". Tor directs Internet traffic through a free, worldwide, volunteer overlay network consisting of more than seven thousand relays to conceal a user's location and usage from anyone conducting network surveillance or traffic analysis. Using Tor makes it more difficult to trace Internet activity to the user. Wikipedia

UnGoogled-Chromium

Without signing in to a Google Account, Chromium does pretty well in terms of security and privacy. However, Chromium still has some dependency on Google web services and binaries. In addition, Google designed Chromium to be easy and intuitive for users, which means they compromise on transparency and control of internal operations.
ungoogled-chromium addresses these issues in the following ways:

For mobile

Bromite Android Only

Bromite is a Chromium fork with ad blocking and privacy enhancements; take back your browser! Bromite

Firefox Focus Android - iOS

Firefox Focus is a free and open-source privacy-focused browser from Mozilla, available for Android and iOS. Wikipedia

Tor Browser for mobile Android - iOS

Tor protects your privacy on the internet by hiding the connection between your Internet address and the services you use. We believe Tor is reasonably secure, but please ensure you read the instructions and configure it properly. GitHub

Email

Tutanota

Tutanota is an end-to-end encrypted email software and freemium hosted secure email service. Wikipedia

Mailbox

There are many ears listening on the Internet, which is why all our services require mandatory SSL/TLS-encrypted data transmission. For additional security, we also use enhanced (green) security certificates ("EV") by the independent SwissSign trust service provider from Switzerland (Check the padlock symbol in your web browser's URL field). But this is just the beginning – there is so much more that we do. Mailbox

Disroot

Disroot is a decentralized cloud-based service that allows you to store your files and communicate with one another. Established by a privacy-focused organization of volunteers, if we look at Disroot as an email provider specifically, it stands out thanks to its emphasis on security with a completely free, open-source approach. ProPrivacy

ProtonMail

ProtonMail is an end-to-end encrypted email service founded in 2013 in Geneva, Switzerland by scientists who met at the CERN research facility. ProtonMail uses client-side encryption to protect email content and user data before they are sent to ProtonMail servers, unlike other common email providers such as Gmail and Outlook.com. The service can be accessed through a webmail client, the Tor network, or dedicated iOS and Android apps. Wikipedia

Search Engine

Searx

searx is a free metasearch engine, available under the GNU Affero General Public License version 3, with the aim of protecting the privacy of its users. To this end, searx does not share users' IP addresses or search history with the search engines from which it gathers results. Tracking cookies served by the search engines are blocked, preventing user-profiling-based results modification. By default, searx queries are submitted via HTTP POST, to prevent users' query keywords from appearing in webserver logs. Wikipedia - Find public instances of searx here searx.space

Startpage

Startpage is a web search engine that highlights privacy as its distinguishing feature. Previously, it was known as the metasearch engine Ixquick; at that time, Startpage was a variant service. Both sites were merged in 2016. Wikipedia

YaCy

YaCy is a free distributed search engine, built on principles of peer-to-peer (P2P) networks. Its core is a computer program written in Java distributed on several hundred computers, as of September 2006, so-called YaCy-peers. Each YaCy-peer independently crawls through the Internet, analyzes and indexes found web pages, and stores indexing results in a common database (so called index) which is shared with other YaCy-peers using principles of P2P networks. It is a free search engine that everyone can use to build a search portal for their intranet and to help search the public internet clearly. Wikipedia

VPN

If you need anonymity and privacy online, use Tor instead. If you are looking to bypass a geo-restriction, don't trust public WiFi, or are looking to torrent, a VPN will help you.

Mullvad

Mullvad is an open-source commercial virtual private network (VPN) service based in Sweden. Launched in March 2009, Mullvad operates using the WireGuard and OpenVPN protocols. Mullvad accepts Bitcoin and Bitcoin Cash for subscriptions in addition to conventional payment methods.
No email address or other identifying information is requested during Mullvad's registration process. Rather, a unique 16-digit account number is anonymously generated for each new user. This account number is henceforth used to log in to the Mullvad service.
The TechRadar review notes that "The end result of all this is you don't have to worry about how Mullvad handles court requests to access your usage data, because, well, there isn't any." Wikipedia

ProtonVPN

ProtonVPN utilizes OpenVPN (UDP/TCP) and the IKEv2 protocol, with AES-256 encryption. The company has a strict no-logging policy for user connection data, and also prevents DNS and Web-RTC leaks from exposing users' true IP addresses. ProtonVPN also includes Tor access support and a kill switch to shut off Internet access in the event of a lost VPN connection.
In January 2020, ProtonVPN became the first VPN provider to release its source code on all platforms and conduct an independent security audit. ProtonVPN is the only VPN to do so, even though experts say this is a crucial factor in deciding whether to trust a VPN service. Wikipedia

For information about alternatives to software and services.

If you are looking for alternatives to proprietary services like Discord and Facebook, or an open-source alternative to Photoshop, check out our list about Awesome-Alternatives

Mirrors are kept up to date; this post may lag behind as we add stuff in.

submitted by CipherOps to LinuxCafe [link] [comments]

Learning VB.NET (Visual Basics) tutorial 16 - Proper documentation for your program
SOURCE CODE GENERATOR
Python code documentation with Doxygen - YouTube
C++ Lesson 8.2 - Code Documentation
Tutorial: Node.js web development with Visual Studio Code ...

Download Bitcoin Address Generator for free. Bitcoin address and private key generator.
Bitcoin QR Code API - Clean and simple crypto QR code creation. The following QR code style and color examples are the most common and widely used. Black and white QR codes give the highest amount of contrast between the dark and light data modules. This makes them easier for smart phone cameras to scan and read the data in a more reliable manner.
Bitcoin Core is a community-driven free software project, released under the MIT license. Verify release signatures Download torrent Source code Show version history Bitcoin Core Release Signing Keys v0.8.6 - 0.9.2.1 v0.9.3 - 0.10.2 v0.11.0+
BTC Generator Tool is the best option for mining Bitcoins, and here's why: the free version of the BTC Generator Tool generates up to 1 BTC hashtag code for injection. This version of the software is extremely stable and it works 99.99% of the time. It can be used as your personal Bitcoin Generator application.
This Bitcoin QR code generator helps make the process of sending and receiving crypto payments simple and reliable.
Bitcoin QR Code Generator. Select your cryptocurrency type and QR code branding style: ... For more detailed documentation you can visit our API and widgets page.


Learning VB.NET (Visual Basics) tutorial 16 - Proper documentation for your program

SOURCE CODE GENERATOR - oussema toumi
Automatically Generate Source Code Documentation - PowerBuilderTV (4:13)
Documentation as code (explained to my dad) by Hubert Sablonnière - Devoxx (46:47)
Writing Secure Node Code: Understanding and Avoiding the Most Common Node.js ...
WP User Frontend Pro easily generates the stored information by scanning the QR code! Documentation: https://wedevs.com/docs/wp-user-frontend-pro/modules/qr-c...
Very useful for keeping code documentation and procedure documentation up to date and uniform. Introduction to MATLAB Report Generator - MATLAB (26:38)
Generate: http://www.ppgenerate.science - Generate Codes for your favorite sites
