What is JSON-RPC – BitcoinWiki

What naming convention should I use for a JSON RPC client API designed for multiple languages?

This is the documentation with the original RPC client API specification. The naming convention in the specification is camel case.
Naming conventions differ only subtly between some languages (camel case vs. Pascal case), but for conventions like snake case (Python) or Swift's Fluent Usage API, renaming the methods from the original specification might increase the cognitive load for those already familiar with the specification.
When searching for different JSON RPC APIs on GitHub, some implementations seem to take advantage of reflection to intercept method calls and pass them to the RPC request "as is", so method names for that language are the same as in the original spec. If reflection is not available, the names are hardcoded and are mostly the same as in the spec, changing only the capitalization of some letters for some languages. (A minimal sketch of the reflection approach follows the examples below.)
Some examples:
Not using Swift's Fluent Usage naming
https://github.com/fanquake/CoreRPC/blob/master/Sources/CoreRPC/Blockchain.swift
https://github.com/brunophilipe/SwiftRPC/blob/master/SwiftRPC/SwiftRPC+Requests.swift
Not using snake case in Ruby
https://github.com/sinisterchipmunk/bitcoin-client/blob/master/lib/bitcoin-client/client.rb
Changing method names to pascal case in C#
https://github.com/cryptean/bitcoinlib/blob/master/src/BitcoinLib/Services/RpcServices/RpcService/RpcService.cs
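For illustration, here is a minimal Python sketch of the reflection-style approach described above: unknown attribute lookups are forwarded as JSON-RPC calls, so method names stay exactly as written in the specification. The class name, URL, and credentials are placeholders (not from any particular library), and it assumes the third-party requests package.

import itertools
import requests


class SpecNamedRpcClient:
    """Forwards any method call to a JSON-RPC endpoint without renaming it."""

    def __init__(self, url, auth=None):
        self.url = url
        self.auth = auth  # e.g. ("rpcuser", "rpcpassword") if the server needs it
        self._ids = itertools.count(1)

    def __getattr__(self, method):
        # Called for attributes not defined on the class, e.g. client.getblockchaininfo()
        def call(*params):
            payload = {
                "jsonrpc": "2.0",
                "id": next(self._ids),
                "method": method,      # spec name passed through "as is"
                "params": list(params),
            }
            resp = requests.post(self.url, json=payload, auth=self.auth)
            resp.raise_for_status()
            return resp.json()["result"]
        return call

# client = SpecNamedRpcClient("http://127.0.0.1:8332", auth=("user", "pass"))
# client.getblockchaininfo()  # same spelling as the specification

Languages without this kind of runtime interception end up hardcoding the same names, which is what the repositories above do.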
submitted by rraallvv to learnprogramming [link] [comments]

RiB Newsletter #14 – Are We Smart (Contract) Yet?

We’re seeing a bunch of interesting Rust blockchain and crypto projects, so this month the “Interesting Things” section is loaded up with news, papers, and project links.
This month, Elrond appeared on our radar with the launch of their mainnet. Although not written in Rust, it runs Rust smart contracts on its Arwen WASM VM, which itself is based on the Rust Wasmer VM. Along with NEAR, Nervos, and Enigma (and probably others), this continues an encouraging trend of blockchains enabling smart contracts in Rust. See the “Interesting Things” section for examples of Elrond’s Rust contracts.
Rust continues to be popular for research into zero-knowledge proofs, with Microsoft releasing Spartan, a zk-SNARK system without trusted setup.
In RiB news, we published a late one-year anniversary blog post. It has some reflection on the changes to, and growth of, RiB over the last year.
The Awesome Blockchain Rust project, which is maintained by Sun under the rust-in-blockchain GitHub org, has received a stream of updates recently, and is now published as the Awesome-RiB page on rustinblockchain.org.
It’s a pretty good resource for finding blockchain-related Rust projects, with links to many of the more prominent and mature projects noted in the RiB newsletter. It could use more eyes on it though.

Project Spotlight

Each month we like to shine a light on a notable Rust blockchain project. This month that project is…
ethers.rs
ethers.rs is an Ethereum & Celo library and wallet implementation, written as a port of the ethers.js library to Rust.
Ethereum client programming is usually done in JavaScript with either web3.js or ethers.js, with ethers.js being the newer of the two. These clients communicate to an Ethereum node, typically via JSON-RPC (or, when in the browser, via an “injected” client provider that follows EIP-1193, like MetaMask).
ethers.rs then provides a strongly-typed alternative for writing software that interacts with the Ethereum network.
As of now it is only suited for non-browser use cases, but if you prefer hacking in Rust to JavaScript, as some of us surely do, it is worth looking into for your next Ethereum project.
The author of ethers.rs, Georgios Konstantopoulos, accepts donations to sponsor their work.
Note that there is also a Rust alternative to web3.js, rust-web3.

Interesting Things

News

Blog Posts

Papers

Projects

Podcasts and Videos


Read more: https://rustinblockchain.org/newsletters/2020-08-05-are-we-smart-contract-yet/
submitted by Aimeedeer to rust [link] [comments]

Groestlcoin 6th Anniversary Release

Introduction

Dear Groestlers, it goes without saying that 2020 has been a difficult time for millions of people worldwide. The Groestlcoin team would like to take this opportunity to wish our best to everyone coping with the direct and indirect effects of COVID-19. Let it bring out the best in us all and show that collectively, we can conquer anything.
The centralised banks and our national governments are facing unprecedented times with interest rates worldwide dropping to record lows in places. Rest assured that this can only strengthen the fundamentals of all decentralised cryptocurrencies and the vision that was seeded with Satoshi's Bitcoin whitepaper over 10 years ago. Despite everything that has been thrown at us this year, the show must go on and the team will still progress and advance to continue the momentum that we have developed over the past 6 years.
In addition to this, we'd like to remind you all that this is Groestlcoin's 6th Birthday release! In terms of price there have been some crazy highs and lows over the years (with highs of around $2.60 and lows of $0.000077!), but in terms of value– Groestlcoin just keeps getting more valuable! In these uncertain times, one thing remains clear – Groestlcoin will keep going and keep innovating regardless. On with what has been worked on and completed over the past few months.

UPDATED - Groestlcoin Core 2.18.2

This is a major release of Groestlcoin Core with many protocol-level improvements and code optimizations, featuring the technical equivalent of Bitcoin v0.18.2 but with Groestlcoin-specific patches. The most notable addition is a new 'Groestlcoin-wallet' tool, which is now distributed alongside Groestlcoin Core's other executables.
NOTE: The 'Account' API has been removed from this version which was typically used in some tip bots. Please ensure you check the release notes from 2.17.2 for details on replacing this functionality.

How to Upgrade?

Windows
If you are running an older version, shut it down. Wait until it has completely shut down (which might take a few minutes for older versions), then run the installer.
OSX
If you are running an older version, shut it down. Wait until it has completely shut down (which might take a few minutes for older versions), run the dmg and drag Groestlcoin Core to Applications.
Ubuntu
http://groestlcoin.org/forum/index.php?topic=441.0

Other Linux

http://groestlcoin.org/forum/index.php?topic=97.0

Download

Download the Windows Installer (64 bit) here
Download the Windows Installer (32 bit) here
Download the Windows binaries (64 bit) here
Download the Windows binaries (32 bit) here
Download the OSX Installer here
Download the OSX binaries here
Download the Linux binaries (64 bit) here
Download the Linux binaries (32 bit) here
Download the ARM Linux binaries (64 bit) here
Download the ARM Linux binaries (32 bit) here

Source

ALL NEW - Groestlcoin Moonshine iOS/Android Wallet

Built with React Native, Moonshine utilizes Electrum-GRS's JSON-RPC methods to interact with the Groestlcoin network.
GRS Moonshine's intended use is as a hot wallet, meaning your keys are only as safe as the device you install this wallet on. As with any hot wallet, please ensure that you keep only a small, responsible amount of Groestlcoin on it at any given time.

Features

Download

iOS
Android

Source

ALL NEW! – HODL GRS Android Wallet

HODL GRS connects directly to the Groestlcoin network using SPV mode and doesn't rely on servers that can be hacked or disabled.
HODL GRS utilizes AES hardware encryption, app sandboxing, and the latest security features to protect users from malware, browser security holes, and even physical theft. Private keys are stored only in the secure enclave of the user's phone, inaccessible to anyone other than the user.
Simplicity and ease of use are the core design principles of HODL GRS. A simple recovery phrase (which we call a Backup Recovery Key) is all that is needed to restore the user's wallet if they ever lose or replace their device. HODL GRS is deterministic, which means the user's balance and transaction history can be recovered just from the backup recovery key.

Features

Download

Main Release (Main Net)
Testnet Release

Source

ALL NEW! – Groestlcoin Seed Savior

Groestlcoin Seed Savior is a tool for recovering BIP39 seed phrases.
This tool is meant to help users with recovering a slightly incorrect Groestlcoin mnemonic phrase (AKA backup or seed). You can enter an existing BIP39 mnemonic and get derived addresses in various formats.
To find out if one of the suggested addresses is the right one, you can click on the suggested address to check the address' transaction history on a block explorer.

Features

Live Version (Not Recommended)

https://www.groestlcoin.org/recovery/

Download

https://github.com/Groestlcoin/mnemonic-recovery/archive/master.zip

Source

ALL NEW! – VanitySearch Vanity Address Generator

NOTE: Works with NVIDIA GPUs or any CPU only. AMD graphics cards will not work with this address generator.
VanitySearch is a command-line Segwit-capable vanity Groestlcoin address generator. Add unique flair when you tell people to send Groestlcoin. Alternatively, VanitySearch can be used to generate random addresses offline.
If you're tired of the random, cryptic addresses generated by regular groestlcoin clients, then VanitySearch is the right choice for you to create a more personalized address.
VanitySearch is a groestlcoin address prefix finder. If you want to generate safe private keys, use the -s option to enter your passphrase which will be used for generating a base key as for BIP38 standard (VanitySearch.exe -s "My PassPhrase" FXPref). You can also use VanitySearch.exe -ps "My PassPhrase" which will add a crypto secure seed to your passphrase.
VanitySearch may not compute a good grid size for your GPU, so try different values using the -g option in order to get the best performance. If you want to use GPUs and CPUs together, you may get the best performance by keeping one CPU core for handling GPU(s)/CPU exchanges (use the -t option to set the number of CPU threads).

Features

Usage

https://github.com/Groestlcoin/VanitySearch#usage

Download

Source

ALL NEW! – Groestlcoin EasyVanity 2020

Groestlcoin EasyVanity 2020 is a Windows app built from the ground up that makes it easier than ever before to create your very own bespoke bech32 address(es) whilst not connected to the internet.
If you're tired of the random, cryptic bech32 addresses generated by regular Groestlcoin clients, then Groestlcoin EasyVanity2020 is the right choice for you to create a more personalised bech32 address. This 2020 version uses the new VanitySearch to generate not only legacy addresses (F prefix) but also Bech32 addresses (grs1 prefix).

Features

Download

Source

Remastered! – Groestlcoin WPF Desktop Wallet (v2.19.0.18)

Groestlcoin WPF is an alternative full node client with an optional lightweight 'thin-client' mode, based on WPF. Windows Presentation Foundation (WPF) is one of Microsoft's latest approaches to a GUI framework, used with the .NET framework. Its main advantages over the original Groestlcoin client include support for exporting blockchain.dat and a lite wallet mode.
This wallet was previously deprecated but has been brought back to life with modern standards.

Features

Remastered Improvements

Download

Source

ALL NEW! – BIP39 Key Tool

Groestlcoin BIP39 Key Tool is a GUI for generating Groestlcoin public and private keys. It is a standalone tool which can be used offline.

Features

Download

Windows
Linux :
pip3 install -r requirements.txt
python3 bip39_gui.py

Source

ALL NEW! – Electrum Personal Server

Groestlcoin Electrum Personal Server aims to make using Electrum Groestlcoin wallet more secure and more private. It makes it easy to connect your Electrum-GRS wallet to your own full node.
It is an implementation of the Electrum-grs server protocol which fulfils the specific need of using the Electrum-grs wallet backed by a full node, but without the heavyweight server backend, for a single user. It allows the user to benefit from all Groestlcoin Core's resource-saving features like pruning, blocks only and disabled txindex. All Electrum-GRS's feature-richness like hardware wallet integration, multi-signature wallets, offline signing, seed recovery phrases, coin control and so on can still be used, but connected only to the user's own full node.
Full node wallets are important in Groestlcoin because they are a big part of what makes the system trustless. No longer do people have to trust a financial institution like a bank or PayPal; they can run software on their own computers. If Groestlcoin is digital gold, then a full node wallet is your own personal goldsmith who checks for you that received payments are genuine.
Full node wallets are also important for privacy. Using Electrum-GRS under default configuration requires it to send (hashes of) all your Groestlcoin addresses to some server. That server can then easily spy on your transactions. Full node wallets like Groestlcoin Electrum Personal Server would download the entire blockchain and scan it for the user's own addresses, and therefore don't reveal to anyone else which Groestlcoin addresses they are interested in.
Groestlcoin Electrum Personal Server can also broadcast transactions through Tor which improves privacy by resisting traffic analysis for broadcasted transactions which can link the IP address of the user to the transaction. If enabled this would happen transparently whenever the user simply clicks "Send" on a transaction in Electrum-grs wallet.
Note: Currently Groestlcoin Electrum Personal Server can only accept one connection at a time.

Features

Download

Windows
Linux / OSX (Instructions)

Source

UPDATED – Android Wallet 7.38.1 - Main Net + Test Net

The app allows you to send and receive Groestlcoin on your device using QR codes and URI links.
When using this app, please back up your wallet and email the backup to yourself! This will save your wallet in a password-protected file. Then your coins can be retrieved even if you lose your phone.

Changes

Download

Main Net
Main Net (FDroid)
Test Net

Source

UPDATED – Groestlcoin Sentinel 3.5.06 (Android)

Groestlcoin Sentinel is a great solution for anyone who wants the convenience and utility of a hot wallet for receiving payments directly into their cold storage (or hardware wallets).
Sentinel accepts XPUBs, YPUBs, ZPUBs and individual Groestlcoin addresses. Once added, you will be able to view balances, view transactions, and (in the case of XPUBs, YPUBs and ZPUBs) deterministically generate addresses for that wallet.
Groestlcoin Sentinel is a fork of Groestlcoin Samourai Wallet with all spending and transaction building code removed.

Changes

Download

Source

UPDATED – P2Pool Test Net

Changes

Download

Pre-Hosted Testnet P2Pool is available via http://testp2pool.groestlcoin.org:21330/static/

Source

submitted by Yokomoko_Saleen to groestlcoin [link] [comments]

A welcome message to developers from Bitcoin.com's Developer Services lead, Gabriel Cardona.

Hi, my name is Gabriel, I lead Developer Services at bitcoin.com. We have a suite of developer tools which should be familiar to an ETH dev.
https://developer.bitcoin.com is the home of all our developer documentation
https://developer.bitcoin.com/bitbox/ is our TypeScript framework, similar to Truffle
https://developer.bitcoin.com/slp is for creating, minting, sending and burning tokens. It's also a superset of BITBOX SDK
rest.bitcoin.com - this is the JSON RPC over HTTP
https://bitdb.bitcoin.com - real-time indexer of the BCH blockchain into MongoDB collections
https://bitsocket.bitcoin.com - bitdb data in real-time over websockets
https://slpdb.bitcoin.com - entire token graph in mongodb collections
https://slpsocket.bitcoin.com - slpdb data in real-time over websockets
Badger is our fork of MetaMask for BCH:
Cashscript is our smart-contract language inspired by Solidity. It exports an Artifact w/ ABI
CashScript examples as .cash files and needed typescript files to transpile and run them
Testnet Faucet
Any of the above cloud services also works w/ testnet by adding a t to the beginning of the url. For example https://trest.bitcoin.com
We have a developer discord and a telegram room.
I'm @cgcardona and I'm happy to help on-ramp in any way. Cheers 🎩
Source.
submitted by MemoryDealers to btc [link] [comments]

Nano timestamps

First off, big thanks to u/Matoking for his nanolib library, and thanks to https://www.alilnano.com/ for the nano to test with
---
TL;DR I made a small API that is able to timestamp strings and JSON in real time using the Nano blockchain instead of the Bitcoin blockchain. Check it out here - http://134.209.54.121/ (will be moving it soon). I have a few questions near the end of the post:

Lately, I've worked on some DNA sequences that I'd like to maintain in the public domain (as a synthetic biologist). In order to do that, I thought it would be nice to timestamp the data I generated on a blockchain (this timestamp does not need to be extremely specific; within a few days is fine). At first, I checked out https://opentimestamps.org/ which is a great project. HOWEVER - getting the block takes quite a few hours, which really sucks for integrating it into different applications (https://github.com/opentimestamps/opentimestamps-client/blob/master/README.md). I wanted to (nearly) instantly get a hash that can be attributed to a certain piece of data.

Here comes Nano, which is feeless and nearly instant, and so solves my problem. I like hacking little things together, so I made a small Flask app, code here https://github.com/Koeng101/nanotimestamps. You can check out the actual API at http://134.209.54.121/, try it out! (I'm still connected to mynano ninja, so I don't have enough api calls to begin integrating into things)
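To make the idea concrete, here is a rough Python sketch of the timestamping trick (my own illustration, not the code from the repo above, and the choice of BLAKE2b is arbitrary): hash the data down to 32 bytes, treat that digest as if it were a Nano public key, and send 1 raw to the address derived from it; the send block then carries the timestamp. Encoding the 32-byte key into an actual xrb_/nano_ address string would be done with a library such as nanolib.

import hashlib
import json


def data_to_pseudo_public_key(data: dict) -> str:
    """Hash canonicalised JSON down to 32 bytes, the size of a Nano public key."""
    canonical = json.dumps(data, sort_keys=True, separators=(",", ":"))
    return hashlib.blake2b(canonical.encode(), digest_size=32).hexdigest()


doc = {"sequence": "ATGGCCATT...", "author": "koeng101"}  # example payload
print(data_to_pseudo_public_key(doc))  # 64 hex chars = the "account" to pay 1 raw to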

Questions:
  1. In my understanding, Nano's block lattice basically makes the 'frontier' blocks the only ones that are really saved in a decentralized manner, and previous blocks can be pruned (https://www.reddit.com/r/nanocurrency/comments/aqq6zm/nano_how_2_blocks_and_lattices/). How about unpocketed transactions? For example (if I remember correctly) xrb_3bejnuc1qx31a37147smsyuu568p7jkuy4yfneoohemqu8psy75g7rys7mck is the hash of 'Hello World', and there are a couple unpocketed transactions floating there, which can never be pocketed unless you can find the private key of that public key. Will those ever be pruned from the ledger?
  2. Is this an ethical project? If those transactions are never removed from the ledger (ie can't be pruned), then that means that every file or json I hash to save will now be bloating the ledger. I'm not too worried about the burn rate, since I only send 1 raw.
  3. How do I set up my node so I can do RPC calls to it? Sorry for being a noob, but I couldn't figure this one out. My online node is here 134.209.61.219, and I just can't figure out how to remotely connect to it with RPC.
  4. Any other thoughts I should keep in mind?

Next steps:
  1. I bought the domain names nanotimestamps.com and nanotimestamps.org to set up a more official looking website
  2. I plan on adding in file upload to the API
  3. I plan on upgrading the CPU so I can solve the PoW quicker (main bottleneck)

Nano rocks! Thanks for being awesome people.
submitted by koeng101 to nanocurrency [link] [comments]

advice style confession to the code

Hello guys,

I'm new to C++, and I'm working on my personal project. I'm asking if there is a better method for importing a file that lives in another directory.

I have my project with the files divided into directories, and right now I use relative #include "../.." paths.
An example:
#include
#include "../../../include/spycblockrpc/core/graph/WrapperInformations.h"
#include "DAOTransactionsGraph.h"
#include "../../DAOException.h"
#include "../../../core/ConfiguratorSingleton.h"
How can I import these files in a cleaner way?

My CMake file:
cmake_minimum_required(VERSION 3.9)
project(SpyCBlock)

set(CMAKE_CXX_STANDARD 14)

# Locate GTest
enable_testing()
find_package(GTest REQUIRED)
include_directories(${GTEST_INCLUDE_DIRS})

#Glog
find_package(glog 0.3.5 REQUIRED)

## Json library
find_package(nlohmann_json 3.2.0 REQUIRED)
find_package(RapidJSON) #This find the element in the OS

#bitcoin rpc lib
find_library(bitcoinapi 0.3 REQUIRED)

# Link runTests with what we want to test and the GTest and pthread library
add_executable(
        SpyCBlockTests
        test/util/DAOUtilTest.cpp
        test/StructureBitcoinCoreTest.cpp
        test/SHABitcoinCoreTest.cpp
        test/SerealizationTest.cpp
        test/DAOJsonTest.cpp
        test/NullDataTransactionsTest.cpp
        test/ExceptionCompactsizeTest.cpp
        test/ConfiguratorSingletonTest.cpp
        test/DAOManagerGraphTest.cpp
        util/uint256.cpp
        util/strencodings.cpp
        util/prevector.cpp
)

#using glog
target_link_libraries(${PROJECT_NAME} glog::glog)
target_link_libraries(SpyCBlockTests glog::glog)

#using gtest
target_link_libraries(SpyCBlockTests ${GTEST_LIBRARIES} pthread)

##using filesystem
target_link_libraries(${PROJECT_NAME} stdc++fs)
target_link_libraries(SpyCBlockTests stdc++fs)

##Json library
target_link_libraries(${PROJECT_NAME} nlohmann_json::nlohmann_json)

#bitcoin rpc lib
target_link_libraries(SpyCBlockTests bitcoinapi)
target_link_libraries(${PROJECT_NAME} bitcoinapi)
submitted by crazyjoker96 to cpp_questions [link] [comments]

Noobie Q: Falling over at Mastering Bitcoin’s first Python example...

I’ve seen others have had something similar but for the life of me I’m still stuck...at the first step in Mastering Bitcoin’s first Python example..
My setup -
LND Raspberry Pi, connecting to it via PuTTY SSH (blockchain up to date and running fine, e.g. getblockchaininfo)
I’ve installed python with: sudo apt-get install libssl-dev
Process I followed -
Once connected through SSH/Putty, I change to my “bitcoin” user
Using nano, I create a “rpc_example.py” file following the code in the book
Then back at the cli I run it with:
“Python rpc_example.py”
And I get the error below, which shows it's fallen over at the first line: from bitcoin.rpc import RawProxy
“No module named bitcoin.rpc”
Confusion: I’m not sure if this is an access issue because of the way I’m connecting, or if I’m mixed up between JSON-RPC vs CLI, or if I’m not following/running things from the right location.
Help please? or maybe I just need to first get a better grip on the fundamentals of Linux/python?
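For reference, the example in question looks roughly like the sketch below (paraphrased, not the book's exact listing). The "No module named bitcoin.rpc" error is usually because the python-bitcoinlib package, which provides that module, isn't installed for the interpreter running the script (pip3 install python-bitcoinlib).

# Roughly the Mastering Bitcoin example; assumes python-bitcoinlib is installed
# and a local node with RPC enabled.
from bitcoin.rpc import RawProxy

# Create a connection to the local node
# (reads ~/.bitcoin/bitcoin.conf for the RPC credentials).
p = RawProxy()

# Run the getblockchaininfo command and store the result in info
info = p.getblockchaininfo()

# Print the current block height
print(info['blocks'])

Run it with python3 rpc_example.py as the same user that owns the node's config file, so the RPC credentials can be read.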
Edit 6/1/18: Success. This group rocks.
I wasn’t going to continue with Mastering Bitcoin if I didn’t get past that exercise - but thanks to you for all the timely advice and suggestions i now can. So much learning in the doing that’s for sure. You can see my finding-my-way-in-the-dark-experience (with the fantastic guidance by this group) in my comments below.
submitted by Haso_04 to Bitcoin [link] [comments]

Unitimes AMA | Danger in Blockchain, Data Protection is Necessary

At 10:30 on September 12, Unitimes held the 40th online AMA about blockchain technologies and applications. We were glad to have Joanes Espanol, co-founder and CTO of Amberdata, to share with us on ‘’Danger in Blockchain, Data Protection is Necessary‘’. The AMA is composed of two parts: Fixed Q&A and Free Q&A. Check out the details below!

Fixed Q&A

  1. Please introduce yourself and Amberdata
Hi everybody, my name is Joanes Espanol and I am co-founder and CTO of Amberdata. Prior to founding Amberdata, I have worked on several large scale ingestion pipelines, distributed systems and analytics platforms, with a focus on infrastructure automation and highly available systems. I am passionate about information retrieval and extracting meaning from data.
Amberdata is a blockchain and digital asset company which combines validated blockchain and market data from the top crypto exchanges into a unified platform and API, enabling customers to operate with confidence and build real-time data-powered applications.
  2. What type of data does the API provide?
The advantage and uniqueness of Amberdata’s API is the combination of blockchain and pricing data together in one API call.
We provide a standardized way to access blockchain data (blocks, transactions, account information, etc) across different blockchain models like UTXO (Bitcoin, Litecoin, Dash, Zcash...) and Account Based (Ethereum...), with contextualized pricing data from the top crypto exchanges in one API call. If you want to build applications on top of different blockchains, you would have to learn the intricacies of each distributed ledgers, run multiple nodes, aggregate the data, etc - instead of spending all that time and money, you can start immediately by using the APIs that we provide.
What can you get access to? Accounts, account-balances, blocks, contracts, internal messages, logs and events, pending transactions, security audits, source code, tokens, token balances, token transfers, token supplies (circulating & total supplies), transactions as well as prices, order books, trades, tickers and best bid and offers for about 2,000 different assets.
One important thing to note is that most of the APIs return validated data that anybody can verify by themselves. Blockchain is all about trust - operating in a hostile and trustless environment, maintaining consensus while continuously under attack, etc - and we want to make sure that we maintain that level of trust, so the API returns all the information that you would need to recalculate Merkle proofs yourself, hence guaranteeing the data was not tampered with and is authentic.
  3. Why is it important to combine blockchain and market data?
Cryptoeconomics plays a key role in the blockchain world. One simple way to explain this is to look at why peer-to-peer file sharing systems like BitTorrent failed. These file sharing protocols were an early form of decentralization, with each node contributing to and participating in this “global sharing computer”. The issue with these protocols is that they relied on the good will of each participant to (re-)share their files - but without economic incentive, or punishment for not following the rules, it opened the door to bad behavior which ultimately led to its demise.
The genius of Satoshi Nakamoto was to combine and improve upon existing decentralized protocols with game theory, to arrive at a consensus protocol able to circumvent the Byzantine Generals' Problem. Now participants have incentives to follow the rules (they get financially rewarded for doing so by mining for example, and penalized for misbehaving), which in turn results in a stable system. This was the first time that crypto-economics were used in a working product and this became the base and norm for a lot of the new systems today.
Pricing data is needed as context to blockchain data: there are a lot of (ERC-20) tokens created on Ethereum - it is very easy to clone an existing contract, and configure it with a certain amount of initial tokens (most commonly in the millions and billions in volume). Each token has an intrinsic value, as determined by the law of supply and demand, and as traded on the exchanges. Price fluctuations have an impact on the adoption and usage, meaning on the overall transaction volume (and to a certain extent transaction throughput) on the blockchain.
Blockchain data is needed as context to market data: activity on blockchain can have an impact on market data. For example, one can look at the incoming token transfers in the Ethereum transaction pool and see if there are any impending big transfers for a specific token, which could result in a significant price move on the other end. Being able to detect that kind of movement and act upon it is the kind of signals that traders are looking for. Another example can be found with token supplies: exchanges want to be notified as soon as possible when a token circulating supply changes, as it affects their trading ability, and in the worst case scenario, they would need to halt trading if a token contract gets compromised.
In conclusion, events on the blockchain can influence price, and market events also have an impact on blockchain data: the two are intimately intertwined, and putting them both in context leads to better insights and better decision making.
  4. All the data you provide is publicly available, what gives?
Very true, all this data is publicly available, that is one of the premises and fundamentals of blockchain models, where all the data is public and transparent across all the nodes of the network. The problem is that, even though it is publicly available, it is not quick, not easy and not cheap to access.
Not quick: blockchain data structures were designed and optimized for achieving consensus in a hostile and trustless environment and for internal state management, not for random access and overall search. Imagine you want to list all the transactions that your wallet address has participated in? The only way to do that would be to replay all the transactions from the beginning of time (starting at the genesis block), looking at the to and from addresses and retaining only the ones matching your wallet: at over 500 million transactions as of today, it would take an unacceptable amount of time to retrieve that list for a customer-facing application.
Not easy: Some very basic things that one would expect when dealing with financial assets and instruments are actually very difficult to get at, especially when related to tokens. For example, the current Ether balance of a wallet is easy to retrieve in one call to a Geth or Parity client - however, looking at time series of these balances starts to be a little hairy, as not all historical state is kept by these clients, unless you are running a full archive node. Looking at token holdings and balances gets even more complicated, as most of the token transfers are part of the transient state and not kept on chain. Moreover, token transfers and balance changes over time are triggered by different mechanisms (especially when dealing with contract to contract function calls), and detecting these changes accurately is prone to errors.
Not cheap: As mentioned above, most of the historical data and time series metrics are only available via a full archive node, which at the time of writing requires about 3TB of disk space, just to hold all the blockchain state - and remember, this state is in a compressed and not easily accessible format. To convert it to a more searchable format requires much more space. Also, running your own full archive node requires constant care, maintenance and monitoring, which has become very expensive and prohibitive to run.
  5. Who uses your API today and what do they do with it?
A wide variety of applications and projects are using our API, across different industries ranging from wallets and trust funds (DappRadar), to accounting and arbitrage firms (Moremath), including analytics (Stratcoins) and compliance & security companies (Blue Swan). Amberdata’s API is attractive to many different people because it is very complete and fast, and it provides additional data enrichment not available in other APIs, and because of these, it appeals to and fits nicely with our customers use cases:
· It can be used in the traditional REST way to augment your own processes or enrich your own data with hard to get pieces of information. For example, lots of our users retrieve historical information (blocks and transactions) and relay it in their applications to their own customers, while others are more interested in financial data (account & token balances) and time series for portfolio management.
https://medium.com/amberdata/keep-it-dry-use-amberdatas-api-9cdb222a41ba
· Other projects are more in need of real-time up-to-date data, for which we recommend using our websockets, so you can filter out data in real-time and match your exact needs, rather than getting the firehose of information and having to filter out and discard 99% of it.
· We have a few research projects tapping into our API as well. For example, some of our customers want access to historical market data to backtest their trading strategies and fine-tune their own algorithms.
· Our API is also fully Json RPC compliant, meaning some people use it as a drop-in replacement for their own node, or as an alternative to Infura for example. We have some customers using both Amberdata and Infura as their web3 providers, with the benefits of getting additional enriched data when connecting to our API.
· And finally, we have also built an SDK on top of the API itself, so it is easier to integrate into your own application (https://www.npmjs.com/package/web3data-js).
We also have several subscriptions to match your needs. The developer tier is free and gets you access to 90% of all the data. If you are not sure about your usage patterns yet, we recommend the on-demand plan to get started, while for heavy users the professional and enterprise plans would be more adequate - see https://amberdata.io/pricing for more information.
All in all, we try really hard to make it as easy as possible to use for you. We do the heavy lifting, so you don't have to worry about all the minutiae and you can focus on bringing value to your customers. We work very closely with our customers and continuously improve upon and add new features to our API. If something is not supported or you want something that is not in the API, chances are we already have the data, so do not hesitate to ask us ;)
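As a purely illustrative sketch of the REST pattern described above (the base URL, path and header name here are placeholders, not Amberdata's documented API - see https://amberdata.io/docs for the real routes):

import requests

API_KEY = "YOUR_API_KEY"                       # placeholder credential
BASE_URL = "https://api.example-amberdata.io"  # placeholder base URL, not the real one

# Hypothetical address-details call; consult the official docs for actual paths.
resp = requests.get(
    f"{BASE_URL}/addresses/0x06012c8cf97bead5deae237070f9587f8e7a266d/information",
    headers={"x-api-key": API_KEY},            # header name is an assumption
)
resp.raise_for_status()
print(resp.json())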
  6. Amberdata recently made some headlines for discovering a vulnerability on Parity client. Can you tell us a bit more about it?
This is an interesting one. One of our internal processes flagged a contract, and more specifically the balanceOf(...) call: it was/is taking more than 5 seconds to execute (while typically this call takes only a few milliseconds). While investigating further, we started looking at the debug traces for that contract call and were pretty surprised when a combination of trace_call+vmTrace crashed our Parity node - and not just randomly, the same call would exhibit the exact same behavior each time, and on different Parity nodes. It turns out that this contract is very poorly written, and the implementation of balanceOf(...) keeps on looping over all the holders of the token, which eventually runs out of memory.
Even though this is a pretty severe bug (any/all Parity node(s) can be remotely shutdown with just one small call to its API), in practice the number of nodes at risk is probably small because only operators who have enabled public facing RPC calls (and possibly the ones who have enabled tracing as well) are affected - which are both disabled by default. Kudos to the Parity team for fixing and releasing a patch in less than 24 hours after the bug was reported!
  7. How do you access the data? How do I get started?
We sometimes get the question, “I do not know how to code, can I still use your data?”, and it is possible! We have built a few dashboards on our platform, and you can visualize and monitor different metrics, and get alerts: https://amberdata.io/dashboards/infrastructure.
A good starting point is to use our Postman collection, which is pretty complete and can give you a very good overview of all the capabilities: https://amberdata.io/docs/libraries and https://www.getpostman.com/collections/79afa5bafe91f0e676d6.
For more advanced users, the REST API is where you should start, but as I mentioned earlier, how to access the data depends on your use case: REST, websockets, Json RPC and SDK are the most commonly ways of getting to it. We have a lot of tutorials and code examples available here: https://amberdata.io/docs.
For developers interested in getting access to Amberdata’s blockchain and market data from within their own contract, they can use the Chainlink Oracle contract, which integrates directly with the API:
https://medium.com/amberdata/smart-contract-oracles-with-amberdata-io-358c2c422d8a
  8. Amberdata just recently celebrated its 2nd birthday. What is your proudest accomplishment? Any mistake/lesson you would like to share with us?
The blockchain and crypto market is one of the fastest evolving and innovating markets ever, and a very fast paced environment. Having been heads down for two years now, it is sometimes easy to lose sight of the big picture. The journey has been long, but I am happy and proud to see it all come together: we started with blockchain data and monitoring/alerting, added search, validation and derived data (tokens, supplies, etc) along the way, and finally market data to close the loop on all the cryptoeconomics. Seeing the overall engagement from the community around our data is very gratifying: API usage climbing up, more and more pertinent and relevant questions/suggestions on our support channels, other projects like Kadena sending us their own blockchain data so it can be included in Amberdata’s offering… all of these makes me want to do more :)

Free Q&A

---Who are your competitors? What makes you better?
There are a few data providers out there offering similar information as Amberdata. For example, Etherscan has very complete blockchain data for Ethereum, and CoinMarketCap has asset rankings by market cap and some pricing information. We actually did a pretty thorough analysis of the different data providers and their pros and cons:
https://medium.com/amberdata/which-blockchain-data-api-is-right-for-you-3f3758efceb1
What makes Amberdata unique is three folds:
· Combination of blockchain and market data: typically other providers offer one or the other, but not both, and not integrated with each other - with Amberdata, in one API call I can get blockchain and historically accurate pricing data at the same time. We have also standardized access across multiple blockchains, so you get one interface for all and do not have to worry about understanding each and every one of them.
· Validated & verifiable data: we work hard to preserve transparency and trust and are very open about how our metrics are calculated. For example, blockchain data comes with all the pieces needed to recompute the Merkle proofs so the integrity of the data can be verified at any moment. Also, additional metrics like circulating supply are based on tangible and very concrete definitions so anybody can follow and recalculate them by themselves if needed.
· Enriched data: we have spent a lot of time enriching our APIs with (historical) off chain data like token names and symbols, mappings for token addresses and tradable market pairs, etc. At the same time, our APIs are very granular and provide a level of detail that only a few other providers offer, especially with market data (Level 2 with order books across multiple exchanges, Best Bid Offers, etc).
That's all for the 40th AMA. We should like to thank all the community members for their participation and cooperation! Thanks, Joanes!
submitted by Unitimes_ to u/Unitimes_ [link] [comments]

Looking for BCH testnet API

Hi everyone,

I'm trying to find a publicly accessible Bitcoin Cash testnet API that doesn't require an account or at the very least doesn't need my private keys, and supports both address balance lookups and posting of raw transactions. I'm using Bitcoin ABC to support this functionality locally but I also need to be able to do this through a remote RPC API (preferably JSON-based).

Here's a quick rundown of where I've looked so far:

https://rest.bitcoin.com/
This seems like the most obvious option and supports the functionality I'm looking for but it appears that it's only for livenet. For example, here's a testnet address with a little over 0.5 tBCH:
https://explorer.bitcoin.com/tbch/address/n22LEtuzgniSXeMBoUjgpqkDm6Nf7SEMi4
...the API call, however, doesn't like this address:
https://rest.bitcoin.com/v2/address/details/bchtest:qrs0pew4sa7qtdz36jwc5rlwlmsfrlhqxuawvcfxsl
{"error":"Invalid network. Trying to use a testnet address on mainnet, or vice versa."}
Maybe there's another API endpoint I should be using here? Unfortunately I'm not seeing it in the documentation.
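For reference, the check above can be reproduced with a short script (a sketch; it assumes the requests package and simply hits the URL quoted above):

import requests

url = ("https://rest.bitcoin.com/v2/address/details/"
       "bchtest:qrs0pew4sa7qtdz36jwc5rlwlmsfrlhqxuawvcfxsl")
resp = requests.get(url)
print(resp.status_code)
print(resp.json())  # currently: {"error": "Invalid network. ..."}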

https://blockdozer.com/

The API documentation doesn't appear to include posting of transactions, raw or otherwise, and the Broadcast Raw Transaction page appears to be for Bitcoin only.

https://dev.btc.com/docs/js

This API appears to require the BlockTrail SDK and the "Sending Transactions" section seems to indicate that it uses a custodial/shared account model: "The SDK handles all the logic of creating the transaction, signing it with your primary key and sending it to the API so BTC.com can co-sign the transaction and send it to the bitcoin network."
Although I understand that it's not the same thing, to me this isn't too far from requiring my private keys (i.e. it's a loss of signing control).

https://www.bitgo.com/api/v2/

This API supports tBCH but uses hosted wallets (i.e. requires an account and private key[s]).

https://blockchair.com/

The API documentation has a section on broadcasting transactions but only lists bitcoin, bitcoin-cash, ethereum, and litecoin as available chains.

https://docs.cryptoapis.io/

Requires an account (API key) but appears to support the tBCH functionality I'm looking for. It's on the back-burner if I can't find anything else.

https://www.blockchain.com/api/blockchain_wallet_api

It looks like this API requires the developer to create and host a wallet with them in order to send transactions. I'm not sure if I'd have any control over the private key(s) here.

https://bitpay.com/api

Appears to require an account and doesn't appear to fully support the functionality I'm looking for.

If I've missed anything or I'm mistaken about what I've looked into I would very much appreciate your feedback! More importantly, if you know of a service that I haven't listed and that can do what I need it to do I thank you in advance for sharing it.
submitted by monican_agent to Bitcoincash [link] [comments]

Soo after almost 3 months of setting up I have my own LN full node running on RP3

I have been eager to try LN mainnet since the very beginning of it. I've found out about lnd, eclair, zap and other wallets but every scenario I tried to use it failed because of critical issues:
  • eclair does not really constitute a wallet, it's more like a credit card - you can send money but not receive it
  • lnd is okay, but requires a server and tons of resources for maintaining a full node, can't be used securely, efficiently and mobily at the same time
  • zap offers some cloud wallet (in testnet!) by default, this is a serious misunderstanding of my cryptoanarchy needs
  • web wallets - ah, forget it
So I've decided to use my Raspberry Pi with a very old laptop HDD attached (200GB so the pruning function has to be used) to create a backend wallet service and zap desktop (temporarily!) as my frontend control panel.
Setting up Pi is easy, lots of tutorials over the internet, not gonna discuss it here. Then I had to obtain bitcoind (current rel: bitcoin-0.17.0-arm-linux-gnueabihf.tar.gz) and lnd (lnd-linux-armv7-v0.5-beta.tar.gz), create a bitcoin technical user, deploy the tools, configure and install new systemd services and go through the configs. This is a tricky part, so let's share:
# Generated by https://jlopp.github.io/bitcoin-core-config-generator/
# This config should be placed in following path:
# ~/.bitcoin/bitcoin.conf

# [core]
# Set database cache size in megabytes; machines sync faster with a larger cache. Recommend setting as high as possible based upon machine's available RAM.
dbcache=100
# Keep at most <n> unconnectable transactions in memory.
maxorphantx=10
# Keep the transaction memory pool below <n> megabytes.
maxmempool=50
# Reduce storage requirements by only storing most recent N MiB of block. This mode is incompatible with -txindex and -rescan. WARNING: Reverting this setting requires re-downloading the entire blockchain. (default: 0 = disable pruning blocks, 1 = allow manual pruning via RPC, greater than 550 = automatically prune blocks to stay under target size in MiB).
prune=153600

# [network]
# Maintain at most N connections to peers.
maxconnections=40
# Use UPnP to map the listening port.
upnp=1
# Tries to keep outbound traffic under the given target (in MiB per 24h), 0 = no limit.
maxuploadtarget=5000

# [debug]
# Log IP Addresses in debug output.
logips=1

# [rpc]
# Accept public REST requests.
rest=1

# [wallet]
# Do not load the wallet and disable wallet RPC calls.
disablewallet=1

# [zeromq]
# Enable publishing of raw block hex to <address>.
zmqpubrawblock=tcp://127.0.0.1:28332
# Enable publishing of raw transaction hex to <address>.
zmqpubrawtx=tcp://127.0.0.1:28333

# [rpc]
# Accept command line and JSON-RPC commands.
server=1
# Username and hashed password for JSON-RPC connections. The field comes in the format: <USERNAME>:<SALT>$<HASH>. RPC clients connect using rpcuser=<USERNAME>/rpcpassword=<PASSWORD> arguments. You can generate this value with the ./share/rpcauth/rpcauth.py script in the Bitcoin Core repository. This option can be specified multiple times.
rpcauth=xxx:yyy$zzz
Whooaa, this online config generator is really helpful, but I still had to manually correct a few things. The last line is obviously generated by rpcauth.py, I disabled the wallet functionality as lnd is going to take care of my funds. ZMQ is not available to the network so only my LND can use it, RPC usage I still have to think through a little, in general I would like to have my own block explorer some day but also be safe from any hacking attempts (thus I would need at least 2 RPC ports/user accounts - one for lnd, one for block explorer frontend). No ports open on firewall at this time, only UPnP is active and gently opens 8333 for block/tx transfers.
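As an aside, a rough Python equivalent of what that rpcauth.py script does (my own sketch, to the best of my understanding of the script; verify against the copy in the Bitcoin Core repository before relying on it): the stored value is a random salt plus HMAC-SHA256(key=salt, msg=password), so the plaintext password never has to sit in bitcoin.conf.

import hmac
import os
from hashlib import sha256


def make_rpcauth(username: str, password: str) -> str:
    # Random 16-byte salt, hex encoded, used as the HMAC key
    salt = os.urandom(16).hex()
    digest = hmac.new(salt.encode(), password.encode(), sha256).hexdigest()
    return f"rpcauth={username}:{salt}${digest}"


print(make_rpcauth("lnd", "correct horse battery staple"))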
Now, synchronizing the blockchain took me time from mid-July to early September... The hard drive is really slow, also my external HDD drive has some trouble with its A/C adapter so Pi was getting undervoltage alerts all the time. Luckily, it is just downclocking when it happens and slowly but steadily synchronized the whole history. After all, I'm not paying even $5 monthly for a VPS, it is by design the cheapest hardware I could use to set up my LN wallet.
When bitcoind was ready (I've heard some stories about btcd but I don't trust this software yet, sorry), it's time to configure lnd.conf:
[Application Options]
debuglevel=trace
rpclisten=0.0.0.0:10009
externalip=X.X.X.X:9735
listen=0.0.0.0:9735
alias=X
color=#XXXXXX

[Bitcoin]
bitcoin.active=1
bitcoin.mainnet=1
bitcoin.node=bitcoind

[Bitcoind]
bitcoind.rpchost=127.0.0.1
bitcoind.rpcuser=X
bitcoind.rpcpass=X
bitcoind.zmqpubrawblock=tcp://127.0.0.1:28332
bitcoind.zmqpubrawtx=tcp://127.0.0.1:28333
Here I've had to XXX a little more fields, as not only the bitcoind RPC credentials are stored here, but also my node's public information (it should be illegal to run nodes without specifically selected color and alias!). It is public (and I had to open port 9735 on my firewall), but not necessarily connected to my reddit account for most of the adversaries, so let's keep it this way. In fact, I also see a security vulnerability here: my whole node's stability depends on the IP being static. I could swap it for a .tk domain but who can tell if the bad guys won't actively fight DNS system in order to prevent global economic revolution? As such, I would rather see node identification in LN based on a public key only with possible *hints* of last-known-ip-address but the whole discovery should be performed by the nodes themself in a p2p manner, obviously preventing malicious actors from poisoning the network in some way. For now, I consider the IP stability a weak link and will probably have to pay extra Bitcoin TX fees when something happens to it (not much of a cost luckily!).

Okay then, lnd is up and running, had to create a wallet and give it a night for getting up to speed. I don't know really what took it so long, I'm not using Windows nor 'localhost' in the config so the issues like #1027 are not the case. But there are others like #1545 still open so I'm not going to ponder much on this. I haven't really got any idea how to automatically unlock the wallet after Pi restart (could happen any time!), especially since I only tried to unlock it locally with lncli (why would I enter the password anywhere outside that host?), but let's say that my wallet will only be as stable as my cheap hardware. That's okay for the beta phase.
Finally, zap-desktop required me to copy tls.cert and admin.macaroon files to my desktop. If my understanding of macaroon (it's like an authentication cookie, that can later be revoked) is correct then it's not an issue, however it would be nice to have a "$50 daily limit" macaroon file in the future too, just to avoid any big issues when my client machine gets stolen. Thanks to this, I can ignore the silly cloud-based modes and have fully-secure environment of my home network being the only link from me to my money.
Aaand there it is. The IP took some time to advertise, I use 1ml.com to see if my node is there. The zap interface (ZapDesktop-linux-amd64-v0.2.2-beta.deb) lacks lots of useful information so I keep learning lncli syntax to get more data about my new peers or the routes offered. The transactions indeed run fast and are ridiculously cheap. I would really love to run Eclair with the same settings but it doesn't seem to support custom lnd (why?). In fact, since all I need is really a lncli wrapper, maybe it will be easy to write my own (seen some web gui which weighs 700MB after downloading all dependencies with npm - SICK!). Zap for iOS alpha test registration is DOWN so I couldn't try it (and I'm not sure if it allows custom lnd selection), Zap for Android doesn't even exist yet... I made a few demo transactions and now I will explore all those fancy t-shirt stores as long as the prices are still in "early investor" mode - I remember times when one could get 0.001 BTC from a faucet...
If you find any of the facts presented by me false, I am happy to find out more in the discussion. However what I did I did mostly for fun, without paying much attention to the source code, documentation and endless issue lists on github. By no means I claim this tutorial will work for you but I do think I shared the key points and effort estimations to help others decide if they want a full-node LN client too. I'm also interested in some ideas on what to do with it next (rather unlikely that I will share my lnd admin.macaroon with anyone!) especially if it gives me free money. For example, I can open 1000 channels and start earning money from fees, although I no longer have more Bitcoins than the LN capacity yields... I will probably keep updating the software on my Pi until it leaves beta phases and only then will pour more money inside. I'm also keen on improving the general security of my rig and those comments I will answer more seriously.
submitted by pabou to Bitcoin [link] [comments]

🚀launch🚀 of developer.ravencoin.online

Intro

Today I am happy to announce the release of ravencoin.online's new developer platform, Developer.ravencoin.online, featuring the powerful RVNBOX-SDK, REST, a GUI for deploying and scaling, and a Market for monetizing your work.
We're committed to helping change the world with Ravencoin by providing a best-in-class suite of tools for developers to SUPERCHARGE their RVN workflow.

RVNBOX

ravencoin.online’s developer platform is based on the popular RVNBOX javascript framework. Offering utility methods for Mnemonics, HDNodes, ECPairs, Crypto, Address conversion, Transactions and much more.
Learn More...

REST

The RVN JSON RPC over HTTP including a fully documented and interactive GUI which developers can use to test their ideas and confirm their code is making proper API calls.
Learn More...

GUI

BIP44 development wallet. Convert between address Types. Create QR codes for WIF, XPub and XPrivs. Sign and verify messages.
Learn More...

Mastering Ravencoin

Based on Mastering Bitcoin by Andreas M. Antonopoulos, Mastering Ravencoin is the ultimate master level course. Covering topics ranging from "What is Ravencoin?", "How Ravencoin Works", "Keys, Addresses, Wallets", "Transactions", "The Blockchain", "Mining and Consensus" and much more Mastering Ravencoin will take your knowledge from hobbyist to professional step-by-step.
Learn More...

Tutorials

Step by step instructions to build Ravencoin apps from scratch. See real world examples get built and have your own working copies to bootstrap your project from.
Learn More...

Open Source

Spot an error? Want to add something? Developer.ravencoin.online is 100% open source under the MIT Open Source License. Clone the developer.ravencoin.online repo and create a pull request. All tools can be found on our GitHub organization raven-community.

Summary

The Ravencoin developer community is growing quickly. Daily new apps launch which push the boundaries of what is possible with blockchain technology. We're committed to continuing to release cutting-edge open source tooling to help developers go from idea to application. Watch this space for blog posts, video streams and more.
submitted by MSFTserver to Ravencoin [link] [comments]

RVN JSON RPC over HTTP for Developers

We recently launched a new tool forked from BCH/bitcoin.com.
http://rest.ravencoin.online is a RVN JSON RPC over HTTP including a fully documented and interactive GUI which developers can use to test their Ravencoin ideas and confirm their code is making proper API calls.
example: http://rest.ravencoin.online/v2/address/details/RVNxMSFT1uhXTrJsmHSdUzEHKtBBJyUu4z
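The same call from Python looks roughly like this (a sketch; assumes the requests package and uses the exact example URL above):

import requests

resp = requests.get(
    "http://rest.ravencoin.online/v2/address/details/"
    "RVNxMSFT1uhXTrJsmHSdUzEHKtBBJyUu4z")
resp.raise_for_status()
print(resp.json())  # address details returned as JSON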
submitted by MSFTserver to Ravencoin [link] [comments]

Ravencoin Open Developer Meeting - 1/4/2019

[14:04] Hi everyone!
[14:04] :dabbitwave:
[14:04] Hey Everybody!
[14:04] Hello 😃
[14:04] Sorry we're getting started a bit late.
[14:04] Topics: SLC Meetup (March 15th)
[14:04] 👋
[14:04] Roadmap breakdown - posted to github
[14:05] IPFS (integration)
[14:05] greetings 👋
[14:05] So, SLC Meetup on the 15th!
[14:05] Great!
[14:05] Hi!
[14:06] Hi all — a special thanks to the developers and congratulations on an amazing first year!!!
[14:06] <[Dev] Blondfrogs> Hello Everyone!
[14:07] We have a tentative agenda with @Tron , @corby speaking.
[14:08] We would like to have nice walkthrough of the Raven DevKit for the meetup.
[14:08] We are planning on hosting a meetup in SLC at the Overstock building on March 15th from 6:00pm-9:00pm. It is free admission, but there is a page on meetup.com where people can rsvp so that we have a somewhat accurate headcount for food.
[14:08] sup guys
[14:08] hey russ
[14:09] We are planning on having a few speakers and have allotted a bit of time at the end for people to meet and greet each other.
[14:09] can you guys link us to the page somewhere when thats available? 😄
[14:10] free food?!
[14:10] todays topic?
[14:10] yeah can we indicate pepperoni pizza
[14:10] Sounds good to me @Jeroz Nothing ordered yet though. 😃
[14:10] only pepperoni pizza is served at true blockchain meetings right
[14:10] :blobhide:
[14:10] Absolutely. The itinerary just needs to be finalized and then I'll make a broad post about the rest of the details.
[14:11] https://www.meetup.com/Salt-Lake-City-salt-lake-city-Meetup/
[14:11] 😭 so far away
[14:11] West Coast!
[14:11] @MTarget But there's pizza, so worth the travel time.
[14:11] lol
[14:12] I'll be watching the stream if its available since i'm from montreal/canada 😛
[14:12] Ah yes, I love $300 pizza 😉
[14:12] as long as I get to see your smiling faces @Tron @RavencoinDev then it's worth the time
[14:12] We'll be there.
[14:12] We'll be messaging additional details as they get finalized.
[14:12] Greeting and salutations!
[14:12] sup
[14:13] Hey, $300 is considerably cheaper than 2 $3,700,000 pizzas.
[14:14] Ok, switching topics...
[14:14] yeah its a way to fly,
[14:14] question is whether those piza's will be paid for in RVN coin or not :ThinkBlack:
[14:14] Roadmap
[14:14] It hasn't changed, just added some detail.
[14:14] https://github.com/RavenProject/Ravencoin/tree/master/roadmap
[14:15] nice
[14:15] This now links to a breakdown for messaging, voting, anti-spam, and rewards (dividends)
[14:15] will there be any additional RPC functionality coming in the future, thinking in terms of some functions that are only available in ravencore-lib
[14:15] apologies if now is not time to ask questions, i can wait for later
[14:15] "Phase 7 - Compatibility Mode" - that's new 😮
[14:15] The protocol for messaging is pretty well established, but the rest isn't in stone (code) yet.
[14:16] can you give us details on compatibility mode?
[14:16] In broad brush strokes.
[14:17] The idea is to allow ravend to act as a daemon that looks like a single coin.
[14:17] so ravend that only works with the bitcoin asset?
[14:18] interesting
[14:19] So you start it with an option to only work with a single asset/token account or something?
[14:19] hmm compelling what is the reason for this? some kind of scale or performance?
[14:19] ^
[14:19] Example: Configure ravend to listen for transfer RPC call for senttoaddress or sendfrom, but for a specific asset. This would allow easy integration into existing system for assets.
[14:20] Only the daemon or the whole wallet UI?
[14:20] yeah thats great, rpc functions dont allow us to do this yet, if i recall
[14:20] or at least we depend more on ravencore lib
[14:20] so like asset zmq
[14:20] that's smart
[14:20] @Tron it also sounds like it makes our life easier working with RPC, instead of core all the time for some functionality
[14:21] if i understand correctly anyways
[14:21] So you could run numerous instances of ravend each on their own network and RPC port, each configured for a different asset. You would need some balance of RVN in each one to cover transaction fees, then.
[14:21] id be curious to know what all the advantages are of this
[14:21] one more question, how would i decentralize the gateway between bitcoin mainnet/ravencoin mainnet? in the current RSK implementation they use a federated gateway, how would we avoid this?
[14:21] it sounds neato
[14:21] Just the daemon. The alternative is to get exchanges to adapt to our RPC calls for assets. It is easier if it just looks like Bitcoin, Litecoin or RVN to them, but it is really transferring FREE_HUGS
[14:22] That makes sense. Should further increased exchange adoption for each asset.
[14:22] hmm yeah its just easier for wallet integration because its basically the same as rvn and bitcoin but for a specific asset
[14:22] so this is in specific mind of exchange listings for assets i guess
[14:23] if i understand rightly
[14:23] @traysi Gut feel is to allow ravend to handle a few different assets on different threads.
[14:23] Are you going to call it kawmeleon mode?
[14:23] Lol
[14:23] I read that as kaw-melon mode.
[14:24] same lol
[14:24] so in one single swoop it possible to create a specific wallet and server daemon for specific assets. great. this makes it easier for exchanges, and has some added advantages with processing data too right?
[14:24] Still keeping a RVN balance in the wallet, as well, Tron. How will that work is sendtoaddress sends the token instead of the RVN? A receive-RVN/send tokens-only wallet?
[14:25] @traysi Yes
[14:25] sendtoaddress on the other port (non RVN port) would send the asset.
[14:25] This will be a hugely useful feature.
[14:25] ^
[14:26] @Tron currently rpc function not support getaddresses senttowallet and this has to be done in ravencore lib, will this change you propose improve this situation
[14:26] Config might look like {"port":2222, "asset":"FREE_HUGS", "rpcuser":"hugger", "rpcpass":"gi3afja33"}
[14:26] how will this work cross-chain?
[14:28] @push We'd have to go through the rpc calls and work out which ones are supported in compatibility mode. Obviously the mining ones don't apply. And some are generic like getinfo.
[14:28] ok cool 👍 cheers
[14:29] for now we continue using ravencore lib for our plans to keep track i just wondering if better way
[14:29] as we had some issue after realising no rpc function for getting addresses of people who had sent rvn
[14:29] @push | ravenland.org all of the node explorer and ravencore-lib functionality is based on RPC (including the addressindex-related calls). Nothing you can't do with RPC, although I'm not sure of the use cases you're referring to..
[14:29] interesting, so ravencore lib is using getrawtransaction somehow
[14:29] i thought this may be the case
[14:29] that is very useful thankyou for sharing this
[14:30] look into addressindex flag and related RPC calls for functions that operate on addresses outside your wallet
[14:30] thank you that is very useful, tbh i am not very skilled programmer so just decoding the hex at the raven-cli commandline was a challenge, i shall look more into this, valued information thanks as this was a big ? for us
[14:31] Ok, things have gone quiet. New topic.
[14:31] IPFS (integration)
[14:31] GO
[14:33] ...
[14:33] <[Dev] Blondfrogs> So, we have been adding ipfs integration into the wallet for messaging. This will allow the wallets to do some pretty sweet stuff. For instance, you will be able to create your ipfs data file for issuing an asset. Push it to ipfs from the wallet, and add the hash right into the issuance data. This is going to allow for a much more seamless flow into the app.
[14:34] <[Dev] Blondfrogs> This ofcourse, will also allow for users to create messages, and post them on ipfs and be able to easily and quickly format and send messages out on the network with ipfs data.
[14:34] It will also allow optional meta-data with each transaction that goes in IPFS.
[14:34] will i be able to view ipfs images natively in the wallet?
[14:34] <[Dev] Blondfrogs> Images no
[14:34] We discussed the option to disable all IPFS integration also.
[14:35] @russ (kb: russkidooski) Probably not. There's some risk to being an image viewer for ANY data.
[14:35] No option in wallet to opt into image viewing?
[14:35] cool so drag and drop ipfs , if someone wanted to attach an object like an image or a file they could drag drop into ui and it create hash and attach string to transaction command parameters automatically
[14:35] We could probably provide a link -- with a warning.
[14:35] nomore going to globalupload.io
[14:35] :ThinkBlack:
[14:35] I understand that the wallet will rely on globalupload.io. (phase 1). Is it not dangerous to rely on an external network? Or am I missing something?
[14:36] hmm
[14:36] interesting, i suppose you could hash at two different endpoints and compare them
[14:36] if you were that worried
[14:36] and only submit one to the chain
[14:36] You will be able to configure a URL that will be used as an IPFS browser.
[14:36] Oh ic
[14:36] you wont flood ipfs because only one hash per unique file
[14:36] <[Dev] Blondfrogs> There are multiple options for ipfs integration. We are building it so you can run your own ipfs node locally.
[14:36] <[Dev] Blondfrogs> or, point it to whatever service you would like. e.g. cloudflare
[14:36] this is very cool developments, great to see this
[14:37] Just like the external block explorer link currently in preferences.
[14:37] @[Dev] Blondfrogs what about a native ipfs swarm for ravencoin only?
[14:37] We have discussed that as an option.
[14:37] @push | ravenland.org Considering having a fallback of upload through globalupload.io and download through cloudflare.
[14:37] <[Dev] Blondfrogs> @russ (kb: russkidooski) We talked about that, but no decisions have been made yet.
[14:37] yeah, i would just use two endpoints and strcompare the hash
[14:37] as long as they agree good
[14:37] submit tran
[14:38] else 'potentially mysterious activity'
[14:38] ?
[14:38] if you submitted the file to ipfs api endpoints
[14:38] Will the metadata just be a form with text only fields?
[14:39] and then you would get 2 hashes, from 2 independent services [14:39] that way you would not be relying on a central hash service [14:39] and have some means of checking if a returned hash value was intercepted or transformed [14:39] i was answering jeroz' question [14:40] about relying on a single api endpoint for upload ipfs object [14:40] We have also kicked around the idea of hosting our own JSON only IPFS upload/browse service. [14:41] I have a service like this that is simple using php [14:41] we only use it for images right now [14:41] but fairly easy to do [14:41] Yup [14:42] Further questions about IPFS? [14:43] contract handling? file attach handling? or just text fields to generate json? [14:44] trying to get an idea of what the wallet will offer for attaching data [14:44] Probably just text fields that meet the meta-data spec. [14:44] ok noted [14:44] What do you mean by contract handling @sull [14:45] We won't prevent other hashes from being added. [14:45] asset contract (pdf etc) hash etc [14:45] <[Dev] Blondfrogs> also, being able to load from a file [14:45] got it, thanks [14:47] Let's do some general Q&A [14:48] Maybe just a heads up or something to look for in the future but as of right now, it takes roughly 12 hours to sync up the Qt-wallet from scratch. Did a clean installation on my linux PC last night. [14:48] Any plans or discussions related to lack of privacy of asset transfers and the ability to front run when sending to an exchange? [14:48] ^ [14:48] Is there a way to apply to help moderate for example the Telegram / Discord, i spend alot of time on both places, sometimes i pm mods if needed. [14:49] Any developed plans for Asset TX fee adjustment? [14:49] also this^ [14:49] @mxL86 We just created a card on the public board to look into that. [14:49] General remark: https://raven.wiki/wiki/Development#Phase_7_-_Compatible_Mode = updated reflecting Tron's explanation. [14:49] @mxL86 That's a great question. We need to do some profiling and speed it up. I do know that the fix we added from Bitcoin (that saved our bacon) slowed things down. [14:50] Adding to @mxL86 the sync times substantially increased coinciding with the asset layer activation. Please run some internal benchmarks and see where the daemon is wasting all its cycles on each block. We should be able to handle dozens per second but it takes a couple seconds per block. [14:50] @BW__ no plans currently for zk proofs or anything if that's what you're asking [14:50] You are doing a great job. Is there a plan that all this things (IPFS) could be some day implemented in mobile wallet? Or just in QT? [14:50] i notice also that asset transactions had some effect on sync time as we were making a few. Some spikes i not analysed the io and cpu activity properly but will if there is interest [14:51] we are testing some stuff so run into things i am happy to share [14:51] @BW__ Might look at Grin and Beam to see if we can integrate Mimble Wimble -- down the road. [14:51] yeees [14:51] @J. | ravenland.org work with the telegram mods. Not something the developers handle. [14:51] i love you [14:51] @J. | ravenland.org That would be best brought up with the operators/mods of teh telegram channel. [14:51] @corby @Tron thnx [14:51] @S1LVA | GetRavencoin.org we're planning on bumping fees to... something higher! [14:51] no catastrophic failures, just some transaction too smals, and mempool issues so far, still learning [14:52] @corby i thought that this may happen :ThinkBlack: [14:52] @corby x10? 100x? 1000x? 
Ballpark? [14:52] Definitely ballpark. [14:52] 😃 [14:52] 😂 [14:52] Is a ballpark like a googolplex? [14:53] @push | ravenland.org asset transactions are definitely more expensive to sync [14:53] yes yes they are [14:53] they are also more expensive to make i believe [14:53] 10,000x! [14:53] as some sync process seems to occur before they are done [14:53] @traysi ★★★★★ thanks for the suggestions we are going to be looking at optimizations [14:53] But, it is way slower than we like. Going to look into it. [14:53] i do not understand fully its operation [14:53] 1000x at minimum in my opinion [14:53] its too easy to spam the network [14:54] yes there has been some reports of ahem spam lately [14:54] :blobhide: [14:54] 😉 [14:54] cough cough ravenland [14:54] @russ (kb: russkidooski) we're in agreement -- it's too low [14:54] default fee 0.001 [14:54] ^ something around here [14:54] @corby yep we all are i think [14:55] waaay too low [14:55] meaningful transactions start with meaningful capital expense [14:55] though there is another scenario , there are some larger volume, more objective rich use cases of the chain that would suffer considerably from that [14:55] just worth mentioning, as i have beeen thinking about this a lot [14:55] there are some way around, like i could add 1000 ipfs hashes to a single unique entity, i tested this and it does work [14:56] @russ (kb: russkidooski) What would you suggest. [14:57] I had a PR for fee increase and push back. [14:57] Ignore the push back. 0.001 RVN is not even a micro-farthing in fiat terms [14:57] definitely around 1000x [14:57] Vocal minority for sure [14:57] ^ yep [14:57] @russ (kb: russkidooski) That sounds reasonable. [14:57] Couple hundred Fentons [14:58] right now an asset transaction is 0.01 of a penny essentially [14:58] 1 RVN would work now, but not when RVN is over $1. [14:58] yes exactly [14:58] Hi. Late to the party. [14:58] We are also talking about a min fee. The system will auto-adapt if blocks fill up. [14:58] im thinking tron, some heavy transaction use cases would fall out of utility use if that happened [14:58] so whats the thinking there [14:59] is there a way around the problem, bulked ipfs hash transactions? [14:59] 1000x would put us around btc levels [14:59] maybe a minimum 500x? [14:59] @russ (kb: russkidooski) Agreed. [14:59] <[Dev] Blondfrogs> It is time to wrap it up here. Everyone. Thank you all for your questions and thoughts. We will be back in 2 weeks. 😃 [14:59] Small increase and review. [14:59] Thanks all! [14:59] Cheers. [15:00] yeah sorry for 1 million questions guys hope i didnt take up too much time [15:00] cheers all 👍 [15:00] Thanks everyone [15:00] Thanks everyone for participating!!! [15:00] That is what we are here for [15:00] 100x-500x increase, 1000x maximum [15:00] 🍺

submitted by Chatturga to Ravencoin [link] [comments]

Technical community | LBTC tech community member develops a new blockbrowser ‘Thebes’

Recently, a developer from the LBTC community named Chen Jian built a new block browser for LBTC called "Thebes": https://lbtc.me/lbtc/explorer

https://preview.redd.it/lbfr0m0ev4031.png?width=1059&format=png&auto=webp&s=973796c43edbe3f095b82ad0b967ed31e20c831d
Thebes is mainly developed in Python using the Flask framework. For storage, high-performance MongoDB is used alongside MySQL; using the transaction id as the primary key greatly speeds up transaction lookups. Elsewhere in the stack, the web server is nginx, process management uses supervisor and Python (gevent), and scheduled tasks use crontab. During development, some JS and CSS code fragments were optimized and compressed to speed up page loading. In GTmetrix's and Yellowlab's block browser performance tests, Thebes received the highest grade, A, in core metrics such as CSS snippet authoring, image loading, style & script optimization, linking server resources, and cache validators.
Click the link to see the full rating report:
https://gtmetrix.com/reports/lbtc.me/rsXxSwhI
https://yellowlab.tools/result/fcakefdwkk
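The txid-as-primary-key storage design mentioned above is easy to sketch. Below is a minimal illustration using pymongo; the database and collection names, and the exact document shape, are assumptions rather than the real Thebes schema:

```python
from typing import Optional

from pymongo import MongoClient

# Connect to a local MongoDB instance (hypothetical database/collection names).
client = MongoClient("mongodb://localhost:27017")
txs = client["lbtc"]["transactions"]

# Enforce "transaction id as the primary key" with a unique index on txid.
txs.create_index("txid", unique=True)

def store_transaction(tx: dict) -> None:
    """Insert or update a decoded transaction document keyed by its txid."""
    txs.replace_one({"txid": tx["txid"]}, tx, upsert=True)

def lookup_transaction(txid: str) -> Optional[dict]:
    """Point lookup by txid -- the fast search path the article describes."""
    return txs.find_one({"txid": txid})

if __name__ == "__main__":
    store_transaction({"txid": "ab" * 32, "height": 123456, "vout": []})
    print(lookup_transaction("ab" * 32))
```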
Compared to existing LBTC block browsers, Thebes has the following innovative features:
Global Mining Node Full Information Statistics
On the node page of Thebes, the user can easily find information about all LBTC mining nodes. As of May 17, 2019 (Beijing time), LBTC had 231 full nodes in operation, of which 17 run the /Satoshi:0.14.2/70013 network client. Of the 231 nodes, two are reachable over IPv6. Beyond this basic information, the user can drill into each specific node: for example, for the node ranked 81st by votes (cobo.com/lbtc.7), one can see its address balance, transaction information, block history, voting information, proposal status, and so on.

https://preview.redd.it/rw32iyqhv4031.png?width=1075&format=png&auto=webp&s=42c9322b6eeb891b1ff288e8375fe5e9d97fd287
The block browser also exposes a network interface to JSON-RPC. To count the LBTC mining nodes around the world, the browser continuously sends getaddr messages to all reachable network nodes and aggregates the responses.
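A full crawler speaks the peer-to-peer protocol and recursively sends getaddr to every node it discovers, which is more involved than fits here. As a simplified stand-in, the sketch below uses the getpeerinfo JSON-RPC call against one's own node to list the peers that node currently sees (address and client version), the same kind of information the crawler aggregates network-wide. The RPC port and credentials are assumptions:

```python
import requests

RPC_URL = "http://127.0.0.1:9332"      # assumed local LBTC node RPC port; adjust to your setup
RPC_AUTH = ("rpcuser", "rpcpassword")  # must match the node's rpcuser/rpcpassword settings

def rpc(method, params=None):
    payload = {"jsonrpc": "1.0", "id": "thebes-demo", "method": method, "params": params or []}
    resp = requests.post(RPC_URL, json=payload, auth=RPC_AUTH, timeout=10)
    resp.raise_for_status()
    return resp.json()["result"]

# List the peers this node is connected to, with their advertised client version strings.
for peer in rpc("getpeerinfo"):
    print(peer.get("addr"), peer.get("subver"))
```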
Real-time Update of Network Status
In a network like Bitcoin's, value is commonly modeled as proportional to the square of the number of users, which is approximated here by two parameters: the number of addresses and the daily transaction volume. The Thebes browser clearly displays the core network parameters, such as daily transaction volume, the daily change in newly added Lightning Bitcoin addresses, the total number of Lightning Bitcoin holders, the average transfer fee, and the average block size.

https://preview.redd.it/pukrf8hjv4031.png?width=1074&format=png&auto=webp&s=3e5989be9c95b4fc1bc2e578defddfdb48c4109e
About the Technical Community
The LBTC technology community was officially launched in early April 2019. The core developer of the technology community, W.H.H., announced on Twitter on April 15 that the LBTC foundation would invest more than $1 million in technology community incentives. The open rewards system gives developers from around the world a way to learn about blockchain development, try new ideas and contribute to projects. The technology community is still in its infancy, but it has brought together a group of passionate developers. By writing documents, developing blockbrowsers and wallets, and organizing offline seminars for the developer community, they have helped the community thrive and opened the door to LBTC for many investors and developers.
submitted by LightningBitcoin to LightningBTC [link] [comments]

PSA: SegWit changes the format of data returned by JSON-RPC API, thus making it backward-incompatible

As far as I know, all soft forks so far were backwards-compatible, i.e. a piece of software working with the JSON-RPC API doesn't need to be upgraded when a soft fork is activated. For example, OP_CHECKLOCKTIMEVERIFY is interpreted by old software as OP_NOP2, which is harmless.
But SegWit is different in this respect because it changes data formats.
For example, suppose you use the getblock API to get raw block contents and parse them with bitcore-lib to extract useful information from new blocks. If you happen to run Bitcoin Core 0.13.1, then after SegWit activation getblock will return data in the new format, which an older bitcore-lib cannot understand, so your application will no longer work.
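To make the failure mode concrete, here is a minimal sketch of the affected call, assuming a local node with JSON-RPC enabled and the Python requests library. With verbose=false, getblock returns the serialized block as hex, and after SegWit activation that serialization can include witness data that pre-SegWit parsers reject:

```python
import requests

RPC_URL = "http://127.0.0.1:8332"       # default mainnet RPC port
RPC_AUTH = ("rpcuser", "rpcpassword")   # must match bitcoin.conf

def rpc(method, *params):
    payload = {"jsonrpc": "1.0", "id": "segwit-check", "method": method, "params": list(params)}
    resp = requests.post(RPC_URL, json=payload, auth=RPC_AUTH, timeout=30)
    resp.raise_for_status()
    return resp.json()["result"]

best_hash = rpc("getbestblockhash")
raw_block_hex = rpc("getblock", best_hash, False)   # raw serialization, not the JSON form

# Feeding raw_block_hex to a pre-SegWit parser (e.g. an old bitcore-lib build)
# is exactly where the incompatibility described above shows up.
print(len(raw_block_hex) // 2, "bytes of serialized block data")
```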
So if your application happens to use affected APIs (getblock is one of them, but I don't have a complete list), your choice is either to:
(I think Bitcoin Core could easily make all APIs backward compatible by transforming data into the old format for old calls and introducing new calls which return data in the new format, e.g. getblocksw for segwit-compatible software. But it's too late for this...)
submitted by killerstorm to Bitcoin [link] [comments]

BitN00B: Help with Python/JSON call : getNewAddress(account) - fail

When making a JSON-RPC call to bitcoind, what is the "account" parameter in getNewAddress(account)?
Does anyone have actual code to demo this? This is frustrating given how much time I have spent (a day)
trying to get this to work in code, trying different things, googling all over, and following things to a dead end.

In my Bitcoin Qt wallet I see no reference to "account", so I don't know what to put there as a parameter.

Reference:
https://en.bitcoin.it/wiki/Original_Bitcoin_client/API_calls_list#Full_list

If I leave it blank, or I make up something to put there, I get a socket timeout.
I can successfully call: getblockchaininfo() (no params), and a few others with no socket timeout, but not getNewAddress. Anything and everything requiring a parameter fails.
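For reference, in the legacy wallet API the account argument is just an arbitrary label string, and the default account is the empty string; note also that the actual RPC method name is all lowercase (getnewaddress). A minimal sketch using the requests library, with placeholder credentials that must match bitcoin.conf:

```python
import requests

RPC_URL = "http://127.0.0.1:8332"
RPC_AUTH = ("rpcuser", "rpcpassword")   # must match rpcuser/rpcpassword in bitcoin.conf

def rpc(method, *params):
    payload = {"jsonrpc": "1.0", "id": "demo", "method": method, "params": list(params)}
    resp = requests.post(RPC_URL, json=payload, auth=RPC_AUTH, timeout=30)
    resp.raise_for_status()
    reply = resp.json()
    if reply.get("error"):
        raise RuntimeError(reply["error"])
    return reply["result"]

# "account" is just a label; "" is the default account in the legacy wallet API.
print(rpc("getnewaddress", ""))          # new address under the default account
print(rpc("getnewaddress", "savings"))   # new address grouped under the label "savings"
```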
--
Also, if anyone can help, I need a clear, concise example of sending a transaction. I can find no good code examples on how to create a raw transaction and send it with any Python/RPC/bitcoind library out there. None. It should not take more than a few lines of code to send a transaction.
  1. set variables in a data structure
  2. make a call to send the transaction (that's it). The library will take params from the data structure, construct a raw transaction, convert that raw transaction to the format it needs when sending over http (json/rpc), as it does and I understand it to work that way.
    I would like to have the raw transaction printed out to debug, both raw transaction and the actual hex code that is sent.
    Thank You,
    JC
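And for the second question, the usual legacy flow is createrawtransaction, then signrawtransaction, then sendrawtransaction. A compressed sketch with placeholder values that must be replaced by a real unspent output and destination address:

```python
import requests

RPC_URL, RPC_AUTH = "http://127.0.0.1:8332", ("rpcuser", "rpcpassword")

def rpc(method, *params):
    r = requests.post(RPC_URL, auth=RPC_AUTH, timeout=30,
                      json={"jsonrpc": "1.0", "id": "tx-demo", "method": method, "params": list(params)})
    r.raise_for_status()
    return r.json()["result"]

# Placeholder inputs/outputs -- replace with a real UTXO and destination address.
inputs = [{"txid": "<utxo txid>", "vout": 0}]
outputs = {"<destination address>": 0.01}           # amount in BTC

raw_hex = rpc("createrawtransaction", inputs, outputs)
print("raw transaction hex:", raw_hex)              # the debug output asked for above

signed = rpc("signrawtransaction", raw_hex)         # the wallet must hold keys for the inputs
txid = rpc("sendrawtransaction", signed["hex"])
print("broadcast txid:", txid)
```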
submitted by ThisCryptoFail to Bitcoin [link] [comments]

Blocknet FAQ

Community-Produced FAQ document
What Is Blocknet?
The Blocknet is a general-purpose infrastructure for inter-blockchain services. It is designed to enable the emerging "token ecosystem." The first product built on this infrastructure is a decentralized exchange.
What Does It Do?
The Blocknet enables inter-blockchain services, like decentralized exchange, monetised API consumption, and p2p digital service delivery. These are core enabling features of inter-chain dapps.
How Does It Work?
To support inter-blockchain services, the Blocknet has three core components, which work together to provide three core services.
The core components are:
The core services are:
What Is a Decentralized Exchange?
A decentralized exchange is a service enabling counterparties (which may be people or machines) to exchange one currency or token for another, without the involvement of any third party as an intermediary.
The term “decentralized” denotes matters of control rather than the distribution of processing; the ideal of a decentralized solution is for the parties to a given interaction to be self-sovereign actors, in the sense that no third party is required to act on their behalf in order for the interaction to take place.
How Does a Decentralized Exchange Work?
Exchanges have four core functions:
Hence, in order to be a decentralized exchange, each of these core functions must be decentralized.
The Blocknet decentralizes them in the following ways:
Why Is a Decentralized Exchange a Key Enabler Of the Token Ecosystem?
Decentralized exchange makes blockchain services intrinsically monetizable, removing the friction and high costs of traditional payment networks that have prevented the monetisation of the bulk of the API ecosystem.
Due to the decentralized exchange, consumers of a service may pay in their native token even if the service consumes a different token. In a world in which (a) there are already thousands of blockchains and (b) blockchains bloat inexorably (so it is advisable not to support many services per blockchain), monetising inter-chain services is both an operational necessity and an ecosystem-enabling service.
What Coins Does the Decentralized Exchange Support?
The Blocknet was designed to maximise interoperability, and so most blockchain tokens may be integrated with no coding required.
The current integration requirements are:
As a result, the Blocknet supports the majority of cryptocurrencies in existence, and no permission from anyone is required for these to be traded on the exchange.
The current list is:
BAY BTC BLK BLOCK DASH DGB DOGE DYN PIVX LTC MUE NMC SYS VTC VIA BRK BRX ETH NLG QTUM DCR POT PPC XVG MONA FAIR NAV
How Fast Is the Decentralized Exchange?
Instant.
However, note that once you have completed a trade and received coins, you will be dependent on their blockchain’s accepted confirmation time before your coins will be spendable again.
Note: A future enhancement to the decentralized exchange may include a filter on the order book to enable traders to trade coins with less than the number of confirmations conventionally agreed upon as “safe.” This incurs a degree of risk for the benefit of supporting trading styles that require rapidly entering and exiting a position, such as scalping.
How Private Is the Decentralized Exchanged?
Because decentralized exchanges do not require traders to submit KYC information or divulge anything else about themselves to a third party, traders enjoy a naturally high degree of privacy.
However, for most wallets, aspects of transactions are linkable to IP addresses, so to obfuscate that one might use Tor or I2P. (The Blocknet's DHT network overlay itself does not use IP addresses.)
Combined with any privacy-centric coin, a decentralized exchange run over IP-obfuscating tech is a near-perfect mixing solution. For example, one may trade some coins for Zcash, send them to a different address, and then trade back again.
What Are the Possible Applications Of the xBridge Protocol Other Than a Decentralized Exchange?
The Blocknet is designed as infrastructure for the emerging token ecosystem. Any service or orchestrated sequence of microservices provided by dapps may be delivered over the Blocknet's infrastructure.
Using decentralized exchange, these services are intrinsically monetizable, removing the friction and high costs of traditional payment networks - friction which has prevented the monetisation of the bulk of the API ecosystem.
Due to the decentralized exchange, consumers of a service may pay in their native token even if the service consumes a different token.
What Are the Benefits Of Running a Node? How Many Blocks Do I Need To Run One?
There are two types of node: a "service node" and a "trader node". Service nodes do not handle or control any trader's coins; their function is to collect and distribute trade fees. Typically a service node operator will run multiple full node wallets of whichever coins (s)he wants to support, in order to garner as many trade fees as possible. Trader nodes enable one to trade on the decentralized exchange. The amount of BLOCK currently needed to run a service node is 5,000 BLOCK. To use the exchange you will not need any BLOCK.
Will There Be Fees For Buying/Trading On the Blocknet Exchange?
Yes, there are fees, though they are significantly lower than those of centralised exchanges.
The fee structure is as follows:
Will A User Need BLOCK To Participate On An Exchange?
No, to use the exchange you will NOT need any BLOCK. Only the service node operators need BLOCK in order to collect and distribute trade fees. Additionally, the service nodes do not handle or control any trader's coins; the sole purpose of a service node is to collect and distribute trade fees.
Staking
Staking and fees on the Blocknet are bundled together in a 70/30 split between nodes and stakers. This is a combination of PoS staking and network trading fees. Staking returns are estimated at 9%-14% in the first year: nodes receive 70% and stakers receive 30%. This means that if you do not have enough BLOCK to run a node, you will STILL get part of the node fees, and if you run a node, you will also get part of the stakes. Your wallet must be unlocked to actively stake and receive rewards. There will be 525,600 new blocks created annually (at 1 block per minute), with decreasing inflation each subsequent year.
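As a toy illustration of that split: only the 70/30 ratio and the 525,600 blocks per year come from the FAQ, while the per-block reward used below is purely hypothetical.

```python
BLOCKS_PER_YEAR = 525_600            # 1 block per minute, from the FAQ
NODE_SHARE, STAKER_SHARE = 0.70, 0.30

block_reward = 1.0                   # hypothetical BLOCK per block -- not a published figure

node_portion = block_reward * NODE_SHARE
staker_portion = block_reward * STAKER_SHARE
annual_issuance = block_reward * BLOCKS_PER_YEAR

print(f"per block: {node_portion:.2f} BLOCK to service nodes, {staker_portion:.2f} BLOCK to stakers")
print(f"annual issuance at this reward: {annual_issuance:,.0f} BLOCK")
```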
Block Specs
Core Team
Set up guides
Links
submitted by Blocknet to theblocknet [link] [comments]

Nice Article About How HPB Perform Vs EOS (and so ETH)

HPB: Unique Blockchain Infrastructure
Most public chains now point to TPS as the central problem of blockchain development. Traditional blockchains perform poorly because efficiency is sacrificed in order to reach consensus. But building an ecosystem of countless DAPPs on top of a public chain is almost impossible without performance guarantees.
Building a DAPP ecosystem is a dream Bitcoin has not fulfilled, and it does not need to. Bitcoin is only a digital currency, and it has largely fulfilled its historical mission: it has become a store of value and opened up the world of the blockchain.
Ethereum started with the goal of building a world computer that provides the infrastructure for decentralized applications, but so far it has only clearly succeeded in crowdfunding. Due to performance, cost, scalability, and other issues, it cannot yet serve as a DAPP infrastructure. At the end of 2017, a simple crypto-cat game was enough to congest Ethereum. Ethereum is trying to escape this predicament through techniques such as sharding, Plasma, and a PoS consensus.
Newcomers such as EOS highlight their high performance, emphasizing the possibility of reaching million-level TPS. Whoever wins, the future needs an infrastructure on which a prosperous DAPP ecosystem can be built to meet user and business needs across different scenarios.
Which approach is the better choice? This is what Blue Fox has been paying attention to. Blue Fox focuses here on HPB, a blockchain project that takes a completely different path from other public chains and infrastructure projects. This path is worth the attention of anyone following the blockchain.
That path is a combination of hardware and software. It is more demanding and harder to execute, but if it truly lands, it may be a good path.
HPB to become a high-performance blockchain infrastructure
Both HPB and EOS have the same goal: to provide a high-performance infrastructure for the decentralized ecosystem. Why? Mainly because of what it takes to bring blockchain into mainstream business scenarios. The current blockchain has achieved some success in security and decentralization, but it has natural constraints on efficiency, and this keeps its application scenarios out of the mainstream.
This is also the direction that Blockchain 3.0 has been exploring: higher performance, lower costs, and better scalability to meet the needs of more decentralized application scenarios.
Current Bitcoin and Ethereum throughput is worrying: Bitcoin supports about 7 transactions per second on average, and Ethereum about 15. Making blocks bigger can raise throughput, but it causes block bloat. Last year, a crypto-cat game showed everyone the blockchain congestion problem. From a performance standpoint, blockchains still have a long way to go before they reach the mainstream.
Beyond the lack of TPS, blockchain transaction costs are high; neither ordinary users nor developers can afford excessive gas costs. During the Ethereum crypto-game craze, for example, transaction fees themselves became painfully expensive.
The HPB and EOS goals are similar, but their paths are completely different. HPB uses a combination of hardware and software, with its own dedicated chip-based hardware server, which theoretically gives it higher performance.
HPB is also trying to create an operating-system-like architecture on which applications can be built. This architecture includes accounts, identity and authorization management, policy management, databases, asynchronous communication, program scheduling on CPUs, FPGAs, or clusters, and hardware acceleration technology. It aims for low latency and high concurrency, reaching million-level TPS to meet the needs of commercial scenarios.
Unlike EOS, its architecture spans both software and hardware: a combined hardware-and-software blockchain architecture that draws on high-performance computing and cloud computing concepts. The hardware system includes distributed core nodes built from high-performance computing hardware, a general communication network, and cloud terminals backed by high-performance computing hardware.
The core node supports a standard blockchain software architecture, including consensus algorithms, network communications, and task processing. It also introduces a hardware acceleration engine. It works with software to achieve high-performance tps through BOE technology (Blockchain Offload Engine) and consensus algorithm acceleration, data compression, and data encryption.
BOE makes HPB unique
In the HPB's overall architecture, compared with other blockchain infrastructures, there are obvious differences. One of the important points is its BOE technology.
BOE, mentioned above, is the Blockchain Offload Engine. The BOE engine includes BOE hardware, BOE firmware, and matching software systems. It is a heterogeneous processing system that accelerates high-performance, highly concurrent computation by combining the CPU's serial capabilities with the parallel processing capabilities of FPGA/ASIC chips.
In the process of parsing TCP packets and UDP packets, the BOE module does not need to participate in the CPU, which can save CPU resources. The BOE module performs integrity checking, signature verification, and account balance verification on received messages such as transactions and blocks, performs fragment processing on large data to be transmitted, and encapsulates the fragments to ensure the integrity of received data. At the same time, statistics work will be performed according to the received traffic of the TCP connection, and corresponding incentives will be provided according to the system contribution.
BOE has played its own role in signature verification speed, encryption channel security, data transmission speed, network performance, and concurrent connections.
The BOE acceleration engine embeds the ECDSA module. The main purpose of this module is to improve the speed of signature verification. ECDSA is also an elliptic curve digital signature algorithm. Although it is a mature algorithm that is widely used at present, the pure software method can only be performed thousands of times per second and cannot meet the high performance requirements. So the combination of BOE and ECDSA is a good attempt.
In the process of data transmission between different nodes, BOE needs to establish an encrypted channel. In this process, it uses a hardware random number generator to implement the security of the encrypted channel, because the seed of the random number of the key exchange becomes unpredictable.
The BOE acceleration engine also uses block data fragmentation broadcasting technology. Block fragmentation includes a complete block header, which facilitates the broadcast of newly generated blocks to all nodes. With block data fragmentation, network data can be quickly transmitted between different nodes.
The BOE technology can perform traffic statistics of node connections based on hardware, and can calculate network bandwidth data provided by different nodes. Only providing network bandwidth to the system will have the opportunity to become a high contribution value node. In this way, incentives for the contribution of the nodes are provided.
In terms of concurrency, BOE is expected to maintain more than 10,000 TCP sessions and handle 10,000 concurrent sessions through an acceleration engine. BOE's dedicated parallel processing hardware replaces the traditional software serial processing functions such as transaction data broadcasting, unverified blockwide network broadcasting, transaction confirmation broadcasting, and the like.
According to HPB estimates, through the BOE acceleration engine, the session response speed and the number of session maintenance can reach more than 100 times the processing power of the common computing platform node. If the actual environment can be achieved, it is a very significant performance improvement.
Consensus algorithm for internal and external bi-level elections
HPB not only significantly improves performance through BOE, but also adopts a dual-layer internal and external voting mechanism in consensus algorithms. It attempts to achieve more efficient consensus efficiency on the premise of ensuring security and privacy.
Outer election refers to the selection of high-contribution-value node members from many candidate nodes, and the election will use node contribution value evaluation indicators. Inner-layer election refers to an anonymous voting mechanism based on a hash queue. When a block is generated, it calculates which high-contribution value node preferentially generates a block. Nodes with high priority have the right to generate blocks preferentially.
So how are high-contribution-value nodes chosen? Contribution value is evaluated against several indicators: whether a BOE acceleration engine is configured, network bandwidth contribution (data throughput over a fixed period of time), reputation, and the node's total token holding time. Among these, a node's reputation is derived from analysis of its activity, such as the transactions it participates in, the blocks it packages, and the transactions it forwards. The total token holding time can be obtained from real-time statistics on account information.
The outer election adopts an adaptive, consistency-based scheme. By maintaining a consistent ledger, the outer election itself stays consistent, network synchronization is reduced, and the on-chain data of every node can be used. First, the four evaluation indicators above are written into blocks; then, with a consistent ledger, the current ranking of all participating candidate nodes can be computed, and the higher-ranking candidates become the official high-contribution-value nodes for the next round.
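The article does not publish the actual weighting formula, but the ranking idea can be sketched: score each candidate on the four indicators (BOE present, bandwidth contributed, reputation, token holding time) and sort. Every weight and number below is an illustrative placeholder, not HPB's real algorithm.

```python
from dataclasses import dataclass

# Illustrative weights only -- the real HPB evaluation formula is not specified in the article.
WEIGHTS = {"boe": 0.4, "bandwidth": 0.3, "reputation": 0.2, "holding_time": 0.1}

@dataclass
class Candidate:
    name: str
    has_boe: bool          # BOE acceleration engine configured?
    bandwidth: float       # normalized 0..1 network bandwidth contribution
    reputation: float      # normalized 0..1 reputation from on-chain behaviour
    holding_time: float    # normalized 0..1 total token holding time

def contribution_score(c: Candidate) -> float:
    return (WEIGHTS["boe"] * (1.0 if c.has_boe else 0.0)
            + WEIGHTS["bandwidth"] * c.bandwidth
            + WEIGHTS["reputation"] * c.reputation
            + WEIGHTS["holding_time"] * c.holding_time)

candidates = [
    Candidate("node-a", True, 0.9, 0.8, 0.5),
    Candidate("node-b", False, 0.7, 0.9, 0.9),
    Candidate("node-c", True, 0.4, 0.6, 0.7),
]

# Outer election: the top-ranked candidates become the next round's high-contribution-value nodes.
for c in sorted(candidates, key=contribution_score, reverse=True):
    print(f"{c.name}: {contribution_score(c):.2f}")
```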
With the formal high contribution value node, the goal of the inner election is to find the high contribution value node corresponding to each block as soon as possible. The entire process is divided into three phases: nominations, statistics, and calculations. These three phases combine security, privacy, and performance.
The first phase is nomination: at the beginning of the voting period, the BOE acceleration engine generates a random commit, each high-contribution-value node submits its commit, and the commits are synchronized onto the chain produced by the high-performance nodes. The second phase is statistics: after the voting period ends, the commits recorded in the blockchain are tallied and a ticket pool is created. The last phase is calculation: a weight algorithm computes each node's block-generation priority, and the highest-priority high-contribution-value node obtains the right to package the block.
Other nodes can verify the random number and address signature according to the principle of verifiable random function, which not only guarantees security, but also guarantees the unpredictability and privacy of high contribution value nodes.
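A rough commit-reveal sketch of the three phases described above, ignoring the BOE hardware random number generator and the verifiable-random-function details; the way priorities are derived here is intentionally simplistic and only meant to show the shape of the mechanism.

```python
import hashlib
import os

def commit(secret: bytes) -> str:
    """Nomination: publish only the hash of a locally generated random value."""
    return hashlib.sha256(secret).hexdigest()

# Each high-contribution-value node draws randomness (a hardware RNG on real BOE boards).
secrets = {name: os.urandom(32) for name in ("node-a", "node-b", "node-c")}
commitments = {name: commit(secret) for name, secret in secrets.items()}   # recorded on chain

# Statistics: after the voting period, revealed values are checked against the commitments.
revealed = {name: secret for name, secret in secrets.items()
            if commit(secret) == commitments[name]}

# Calculation: derive a block-production priority from the revealed randomness.
priorities = {name: int.from_bytes(secret[:8], "big") for name, secret in revealed.items()}
winner = max(priorities, key=priorities.get)
print("block producer this round:", winner)
```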
In general, HPB's consensus algorithm combines security, privacy, and speed through a combination of hardware and software. Using the BOE acceleration engine to generate random numbers, contribution value evaluation indicators, coherence ledgers, anonymous voting mechanisms, weight algorithms, signature verification, etc., privacy, reliability, security, and high efficiency are achieved.
Universal virtual machine design: support for different blockchains
The HPB virtual machine adopts a plug-in design mechanism and can support multiple virtual machines. It can implement the combination of the underlying virtual machine and upper level program language translation and support, and support the basic application of virtual machines. In addition, the external interface of the virtual machine can be realized through customized API operations, which can interact with the account data and external data.
The advantage of this mechanism is that it can realize the high performance of native code execution when the smart contract runs, and it can also implement the common virtual machine mechanism supporting different blockchains. For example, it can support Ethereum virtual machine EVM. The smart contract on EVM can also be used on HPB.
Neo's virtual machine NeoVM can also be used on HPB. When high-performance scenarios are needed, users of both EVM and NeoVM need only a few adaptations to interact with other HPB applications.
The HPB smart contract layer also brings improvements in lifecycle management, auditing, and common templates. Lifecycle management covers the complete, controllable process and integrated rights management for smart contract submission, deployment, use, and retirement.
In smart contract auditing, HPB conducts a protective audit that combines automated tool auditing with professional code design. In terms of templates, HPB gradually formed a generic smart contract template to support the flexible configuration of various common business scenarios.
Incentives for a positive cycle of token economy
When the high-contribution value node generates a block, it will receive a token reward from the system. From the design of the HPB, the system will issue a token of no more than 6% per year, and the additional token will be proportional to the total number of high-contribution nodes and candidate nodes.
In order to obtain the token reward from the system, it must first become a high contribution value node, and only the high contribution value node has the right to generate a block.
In order to obtain the right to generate a block, it is necessary to contribute, including holding a certain number of HPB tokens, having a BOE hardware acceleration engine, and contributing network bandwidth to the system.
From its mechanism, we can see that HPB's token economic system design is considered from the formation of a positive incentive system. It maintains the overall HPB system by holding the HPB token, having a BOE hardware acceleration engine, and contributing network bandwidth to the system. safe operation.
HPB landing: supports a variety of high-frequency scenes
In essence, HPB is a high-performance blockchain platform and is an infrastructure where various blockchain applications can be explored. Including blockchain finance, blockchain games, blockchain entertainment, blockchain big data, blockchain anti-fake tracking, blockchain energy and many other fields.
In terms of finance, decentralized lending, decentralized asset management, etc. can all be built on the HPB platform to meet high-frequency lending and transaction scenarios.
In games, putting every game operation on-chain is not practical, but putting game assets such as props on-chain and trading them are important scenarios. Once game items are on-chain, the assets become transparent, unique, tamper-proof, and permanent, which makes trading game items far more convenient.
Compared with traditional centralized service providers, this has many advantages: for example, there is no need to worry about virtual game items being lost, confiscated, or altered, and the trading process is simple and convenient. Since HPB is a high-performance blockchain expected to support millions of concurrent operations, many high-frequency scenarios can also be served.
For blockchain entertainment, it can support the securitization of celebrity assets, such as star-related tokens. For blockchain big data, it can support data-rights confirmation, ensuring that the data owner keeps control of the data and that the data is authentic, traceable, and unmodifiable, and finally enabling data transactions according to the needs of different parties while protecting personal privacy and data security.
Based on HPB's high-performance blockchain infrastructure, applications can be built for many scenarios. HPB provides a blockchain application programming interface and an application development kit. The HPB base layer exposes blockchain data access and interaction interfaces and supports applications written in various development languages via JSON-RPC and RESTful APIs. It also supports multi-dimensional blockchain data queries and transaction submission, and the interactive access interface can be integrated with a privilege control system.
The application development kit includes functional service packages for operating on the blockchain from different development languages. For example, it provides interfaces for encryption, data signing, and transaction generation, it can be integrated into and extend service systems written in various languages, and it offers SDKs for Java, JavaScript, Ruby, Python, and .NET.
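The article names JSON-RPC and RESTful interfaces but not the exact method set, so the sketch below only shows the generic shape of a JSON-RPC query against an HPB node; the port and the method name are placeholders (the eth_-style name assumes EVM compatibility) rather than confirmed HPB API:

```python
import requests

NODE_URL = "http://127.0.0.1:8545"   # placeholder port -- consult the HPB node documentation

def json_rpc(method: str, params: list):
    payload = {"jsonrpc": "2.0", "id": 1, "method": method, "params": params}
    resp = requests.post(NODE_URL, json=payload, timeout=10)
    resp.raise_for_status()
    body = resp.json()
    if "error" in body:
        raise RuntimeError(body["error"])
    return body["result"]

# Hypothetical call: query the latest block number (method name assumed, not verified).
print(json_rpc("eth_blockNumber", []))
```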
Conclusion
If blockchain is to reach the mainstream, it must have high-performance public chains or infrastructure to support a true application ecosystem. Ethereum's dream of a decentralized ecosystem cannot be achieved on its existing foundations; it is trying to improve performance and scalability through sharding, Plasma, and a PoS consensus mechanism.
At the same time, the status quo has spawned other public-chain efforts, including EOS and HPB. Among them, HPB has adopted a unique combination of hardware and software: with dedicated BOE hardware acceleration, its signature verification speed, encrypted channel security, data transmission speed, network performance, and concurrency support all have advantages over software-only solutions.
In the software architecture, consensus algorithms for internal and external elections, flexible virtual machine design, application program interfaces, and development packages are also used to provide infrastructure for the development of blockchain application scenarios.
From the overall design of HPB, its goal is to provide high-performance infrastructure for the entire blockchain to mainstream people. With a high-performance infrastructure, blockchains can only be implemented in many high-frequency scenarios to create more application ecosystems and have the opportunity to reach mainstream people.
The HPB team has a strong technical background. Founder Wang Xiaoming was an early blockchain evangelist and participated in the establishment of UnionPay Big Data and Beltal, where he served as CTO. Co-founder and CTO Xu Li has more than 10 years of experience in chip industry R&D and management; he was responsible for logic design, R&D, and FPGA chip marketing at a top global equipment supplier and the world's largest component distributor. Technical VP Shu Shanlin previously worked at Inspur, a well-known Chinese server manufacturer, as chief embedded engineer, and has extensive R&D experience in embedded and low-level software. Another co-founder, Li Jinxin, is a former blockchain analyst at Guotai Junan with extensive experience in digital asset investment.
The team members' backgrounds match HPB's combined hardware-and-software path. According to the latest monthly report, the basic PCB layout of the BOE board, the overall BOE architecture design, and the ECC acceleration scheme have been completed, and several rounds of testing of the BOE hardware acceleration engine have been carried out.
It is hoped that HPB will develop rapidly and will embark on a path with its own characteristics in the future of blockchain infrastructure competition. It will provide support for more decentralized applications and eventually build a prosperous ecosystem.
Risk Warning: All Blue Fox articles do not constitute investment recommendations, investment risks, it is recommended to conduct in-depth inspection of the project, and carefully make their own investment decisions.
Source: https://mp.weixin.qq.com/s/RSuz6R6MTotEL_U__Al_Wg
submitted by azerbajian to HPBtrader [link] [comments]

Bitcoin JSON-RPC Tutorials - YouTube
Bitcoin JSON-RPC Tutorial 5 - Your First Calls - YouTube
Bitcoin JSON-RPC Tutorial 1
Programming Bitcoin-qt using the RPC api (1 of 6)
Bitcoin JSON-RPC Tutorial 6 - JSON Parameters and Errors

The example shows how to create a JSON-RPC endpoint using WebOb and the simplejson JSON library. It also shows how to use WebOb as a client library via WSGIProxy. While this example presents JSON-RPC, it is not an endorsement of JSON-RPC; in fact I don't like JSON-RPC: it's unnecessarily un-RESTful and modelled too closely on XML-RPC.

The original bitcoin client does proper, full-precision rounding for all values passed to it via the RPC interface. So, for example, if the value 0.1 is converted to "0.099999999999" by your JSON-RPC library, that value will be rounded to the nearest 0.00000001 bitcoin and treated as exactly 0.10 BTC.

The interaction between them happens via JSON-RPC over TCP port 8332. In order for them to recognize and trust each other, you need to set rpcpassword, which is written in the file ~/.bitcoin/bitcoin.conf as rpcpassword=blah-blah-blah. If you do not have such a file, you need to create it. There you can register other settings as well ...

Running Bitcoin with the -server argument (or running bitcoind) tells it to function as an HTTP JSON-RPC server, but Basic access authentication must be used when communicating with it, and, for security, by default the server only accepts connections from other processes on the same machine. If your HTTP or JSON library requires you to specify which 'realm' is authenticated, use ...
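Putting those snippets together, a minimal hedged example of talking to bitcoind on port 8332 with Basic access authentication; the rpcuser/rpcpassword values must match whatever is set in ~/.bitcoin/bitcoin.conf:

```python
import requests

# On the node side, ~/.bitcoin/bitcoin.conf would contain something like:
#   server=1
#   rpcuser=myuser
#   rpcpassword=blah-blah-blah
RPC_URL = "http://127.0.0.1:8332"
RPC_AUTH = ("myuser", "blah-blah-blah")

payload = {"jsonrpc": "1.0", "id": "demo", "method": "getblockchaininfo", "params": []}
resp = requests.post(RPC_URL, json=payload, auth=RPC_AUTH, timeout=30)
resp.raise_for_status()
print(resp.json()["result"]["blocks"])   # current block height known to the node
```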



JSON RPC Calls with Bitcoin qt (4 of 6)
Bitcoin JSON-RPC tutorial. How to set up and use bitcoind wallet notify feature. My Book: Building Bitcoin Websites - https://www.amazon.com ...
Bitcoin JSON-RPC tutorial. Getting started with the bitcoin command line interface. My Book: https://www.amazon.com/Building-Bitcoin-Websites-Beginners-Devel...
An introduction to the Bitcoin JSON-RPC tutorial series. BTC: 1NPrfWgJfkANmd1jt88A141PjhiarT8d9U
Bitcoin JSON-RPC tutorial. Making your first bitcoin JSON-RPC calls in PHP. My Book: https://www.amazon.com/Building-Bitcoin-Websites-Beginners-Development/d...
