Network Enhancers - "Delivering Beyond Boundaries" Headline Animator

Monday, December 4, 2017

New OWASP Top 10 is finally here, injections still dominate

New Open Web Application Security Project 'top-ten risks' list highlights series of common failings, as well as a trio of new risks

The Open Web Application Security Project (OWASP) has revealed an updated ‘top ten’ list of the most urgent application security issues facing organisations, replacing a list that had stood unchanged for four years.
The infamous vulnerability ranking was last updated in 2013, and although it is not a formal standard, it has been widely adopted by security professionals and bug bounty platforms.

In spite of the four-year hiatus, injection is still the biggest threat to business applications today, followed by Broken Authentication and Sensitive Data Exposure.

There are three newcomers to the top ten: XML External Entities (XXE), Insecure Deserialisation, and Insufficient Logging and Monitoring. The full top ten is available on the OWASP website.

XML External Entities (XXE) was added on the basis of data from source code analysis tools, which have uncovered a range of vulnerabilities including disclosure of internal files or shares, internal port scanning, remote code execution (RCE), and denial of service (DoS).
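The classic XXE payload defines an external entity pointing at a local file; a vulnerable parser expands it and leaks the file's contents. A minimal sketch in Python (the `/etc/passwd` path is just the textbook example) also shows the mitigation: the standard-library parser refuses to fetch external entities, so the parse fails instead of leaking data.

```python
import xml.etree.ElementTree as ET

# Textbook XXE payload: the external entity &xxe; points at a local file.
# A vulnerable XML parser would expand it and disclose the file contents.
payload = """<?xml version="1.0"?>
<!DOCTYPE data [<!ENTITY xxe SYSTEM "file:///etc/passwd">]>
<data>&xxe;</data>"""

# Python's stdlib parser does not fetch external entities, so parsing
# fails instead of leaking the file -- the behaviour you want by default.
try:
    ET.fromstring(payload)
    print("parser expanded the entity (vulnerable)")
except ET.ParseError as err:
    print("parser refused the external entity:", err)
```

The fix in other stacks is the same idea: disable DTD/external-entity resolution in whatever XML library you use.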
Insecure Deserialisation is also a new category; it can lead to replay attacks, injection attacks, and privilege escalation.
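Insecure deserialisation is easy to demonstrate in Python, where `pickle` will invoke whatever callable a crafted payload names during loading. The callable below is a harmless `print`; an attacker would substitute `os.system` or similar, which is why untrusted data must never be unpickled.

```python
import pickle

class Exploit:
    # __reduce__ tells pickle which callable to invoke on load.
    # A real attacker would return (os.system, ("malicious command",));
    # we use a harmless built-in to keep the demo safe.
    def __reduce__(self):
        return (print, ("arbitrary code ran during unpickling!",))

blob = pickle.dumps(Exploit())

# A service that blindly unpickles untrusted input executes the payload:
pickle.loads(blob)
```

The defence is to use a data-only format (JSON, protobuf) or to sign serialised blobs so only trusted producers can create them.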

Finally, Insufficient Logging and Monitoring has made the list, since it makes it difficult for admins to detect and respond to attacks. The Project's analysts pointed out that in some situations it can take more than 200 days to detect a breach.

Other areas to watch for the future include architectural changes, according to the analysts, particularly in “single page applications” written with Angular or React, where functionality has been moved from the server side to the client which “brings its own security challenges”. Another new risk area is in microservices, where old code is put behind RESTful or other APIs, but was never designed to be exposed externally: “The base assumptions behind the code, such as trusted callers, are no longer valid”, the report said.

Ilia Kolochenko, CEO of High-Tech Bridge, welcomed the new top 10, but also pointed out that the best-practice principles of secure application development are unchanged: “There are many different best practices to tackle application security at the early stage of software development. First of all, you need to ascertain that your developers are aware of the most recent secure coding techniques, and do properly apply them. Many qualified developers tend to ignore ‘minor’ (in their perception) flaws, such as XSS or XSRF, putting the entire application at risk.

“Others candidly believe that if they have a WAF or RASP, no code security is ever required. Therefore, regular training on new attack techniques and exploitation vectors is very helpful to highlight the practical side of security.

“Application security starts with a secure and clear design of the application and related components. Before starting development, you need to plan how to handle security, reliability, compliance requirements (if any) and data encryption. Otherwise, any substantial corrections at later stages can easily become extremely expensive.

“Don't forget to implement continuous security monitoring and testing once the application is deployed to production: what is secure today can easily become vulnerable tomorrow,” he summarised.

Saturday, October 14, 2017

Bitcoin Transactions: How they work

Monday, October 9, 2017

Difference Between PoW, PoS, and PoI

Not a day goes by without hearing two terms when you are involved in cryptocurrency and blockchain technology: PoW and PoS (there is a third one you don't hear about as much, but you will get to know it here). When you are trying to find the next best investment in cryptocurrency, one of the factors that could affect your portfolio is the algorithm the project is using (except Bitcoin, which is a different situation entirely). Let's go through each one and see how they work.

Proof of Work
The PoW (Proof of Work) algorithm is based mainly on computational power: you are as powerful as the number (and power) of your processors. It is the oldest and most common algorithm in blockchain technology. One of its problems is that if you want to mine coins, you need powerful devices made specially for that purpose, which cost real money, and you also need to spend a great amount of electricity and bandwidth over the course of mining. The whole thing does not seem profitable, and in reality that is correct: if you don't have proper equipment, you will probably spend more than you earn.
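The "work" in PoW is essentially a hash lottery: keep trying nonces until the block's hash meets a difficulty target. A toy Python sketch, where four leading zero hex digits stand in for the real (vastly harder) difficulty:

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Brute-force a nonce so that sha256(block_data + nonce)
    starts with `difficulty` zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

nonce = mine("block-payload", 4)
print("found nonce:", nonce)
print("block hash :", hashlib.sha256(f"block-payload{nonce}".encode()).hexdigest())
```

Each extra zero digit multiplies the expected work by 16, which is why real mining consumes so much hardware and electricity while verification stays cheap (one hash).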

Another problem, besides the wasted energy, is that in the future, when there are no coins left to mine, miners will be paid only through transaction fees. Because mining is very expensive, fewer miners will stay in the game, leaving the network prone to a 51% attack (when one miner controls 51% or more of the mining power and is able to rewrite blocks). Yet another problem is that the rich get richer: the more resources you provide, the more coins you mine.

Proof of Stake

As you can see, PoW is not an ideal algorithm for a blockchain, so another algorithm was developed that works completely differently: it makes you as powerful as your stake. In other words, the number of coins you earn is related to the number of coins you hold. For example, if you hold 10% of a coin's supply, you would earn roughly 10% of the new coins created in the future (the concept is simplified; there are more technical details), because the probability of you signing the next block is proportional to your stake. In this case you no longer need to solve very hard mathematical puzzles, which avoids wasting resources such as electricity. And if someone wants to own more than 51% of the stake in order to attack the network, he or she has to buy 51% of the coins, which is practically impossible: as soon as the buying starts, the price skyrockets and the attacker can no longer afford it.
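The stake-proportional selection described above can be sketched in a few lines of Python; the account names and balances are made up for illustration:

```python
import random

# Toy stake-weighted signer selection: the chance of signing the next
# block is proportional to the coins each account holds.
stakes = {"alice": 10, "bob": 40, "carol": 50}  # 10% / 40% / 50% of supply

def pick_signer(stakes: dict, rng=random) -> str:
    holders = list(stakes)
    weights = [stakes[h] for h in holders]
    return rng.choices(holders, weights=weights, k=1)[0]

random.seed(1)  # deterministic demo
counts = {h: 0 for h in stakes}
for _ in range(10_000):
    counts[pick_signer(stakes)] += 1
print(counts)  # roughly 1000 / 4000 / 5000 -- proportional to stake
```

Over many blocks, rewards converge to each holder's share of the supply, which is exactly the "10% of coins earns ~10% of new coins" behaviour, and also exactly the rich-get-richer property criticised next.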

One problem remains: the person who holds the oldest or largest set of coins gets more (the rich get richer, the same as in PoW), which is not everybody's favourite outcome; after all, the only thing required of you is to prove ownership of your stake, that is, the number of coins you possess. So what should we do? The answer has already been found. Incidentally, the first coin to implement PoS was Peercoin (PPC).

Proof of Importance

This is where PoI comes in. Because of the problems mentioned above, the PoI algorithm was created. The idea behind it is that you are as important as your activity on the network: those who are active on the network and help the project (loyal users) are rewarded. Each address (user) is given a trust score, which rises with activity on the network; the higher it gets, the better your chance of being rewarded by the network. This way the rich do not necessarily get richer, and everyone has a chance to be rewarded based on loyalty and effort. As I said before, this is a simplified account intended to give you a general understanding; there is more technical complexity behind it.
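A toy model of the idea: score each account on both its holdings and its transaction activity, so an active small holder can outrank an idle whale. The 0.3/0.7 weighting and the account data are arbitrary illustrations, not NEM's actual importance formula.

```python
# Hypothetical accounts: a large but idle holder vs. a smaller, very
# active participant in the network economy.
accounts = {
    "idle_whale":    {"balance": 900_000, "tx_volume": 1_000},
    "active_trader": {"balance": 50_000,  "tx_volume": 400_000},
}

def importance(acct: dict, totals: dict) -> float:
    """Blend stake share and activity share (weights are illustrative)."""
    stake_part = acct["balance"] / totals["balance"]
    activity_part = acct["tx_volume"] / totals["tx_volume"]
    return 0.3 * stake_part + 0.7 * activity_part

totals = {
    "balance": sum(a["balance"] for a in accounts.values()),
    "tx_volume": sum(a["tx_volume"] for a in accounts.values()),
}
for name, acct in accounts.items():
    print(name, round(importance(acct, totals), 3))
```

With these numbers the active trader scores higher than the idle whale despite holding far fewer coins, which is the behavioural difference from pure PoS.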

The coin using this algorithm is NEM (XEM), which is one of the more promising coins for long-term investment precisely because of this well-thought-out algorithm.

Wednesday, September 20, 2017

OpenStack and OpenShift

Both OpenStack and OpenShift Origin are open source projects, and both provide cloud computing foundations. However, they do not compete with each other.

OpenStack provides “Infrastructure-as-a-Service”, or “IaaS”. It provides bootable virtual machines, networking, block storage, object storage, and so forth. Some IaaS service providers based on OpenStack are HP Cloud and Rackspace Cloud. Red Hat is fully engaged with the larger OpenStack user and developer community. Our developers are the source of a significant amount of code to implement features and fix bugs in the OpenStack projects.

The OpenShift hosted service provides “Platform-as-a-Service” or “PaaS”. It provides the necessary parts to quickly deploy and run a LAMP application: the web server, application server, application runtimes and libraries, database service, and so forth.

OpenShift Origin is the open source project of the software that enables the OpenShift hosted service. Using OpenShift Origin, you can build your own PaaS.

A PaaS typically runs on top of an IaaS provider. For example, both the OpenShift hosted service and the Heroku hosted service run on top of Amazon’s AWS IaaS service.

OpenShift Origin can run on top of OpenStack. For example, our Krishna Raman has written a blog post: OpenShift Origin on OpenStack which describes step by step how to set it up.

In addition to OpenShift Origin being able to run on top of OpenStack, it can run on top of AWS, it can run on top of KVM or VMware in your own data center, it can run on top of Virtual Box in your personal laptop, and it can even run on top of “bare metal” unvirtualized Linux hosts.

So, when I am asked “How does OpenShift relate to OpenStack?”, I answer “OpenShift Origin can run on top of OpenStack. They are complementary projects that work well together. OpenShift Origin is not presently part of OpenStack, and does not compete with OpenStack. If you stand up your own OpenStack system, you can make it even more useful by installing OpenShift Origin on top of it.”

Wednesday, August 23, 2017

How to set up an all open-source IT infrastructure from scratch

Courtesy - Bryan

Hypothetical: You need to set up the IT infrastructure (email, file sharing, etc.) for a new company. No restrictions. No legacy application support necessary. How would you do it? What would that ideal IT infrastructure look like?

I decided to sit down and think through my ideal setup — based on quite a few years of being a vice president of engineering at various companies — and document it here. Maybe you’ll find my choices useful; maybe you’ll think I’m crazy. Either way, these are good things to consider for any organization. 
Run services on your own servers 

The first thing I’m going to decide on, right up front, is to self-host as many services as I possibly can. 

Sure, there are noteworthy benefits to paying another company to host and maintain your services for you (primarily that you don’t have to have someone on staff to perform that function), but the drawbacks far outweigh the good points. 

Having full control over your own data — how it is stored and who it is shared with — is critical to any business (and any individual). Most of the choices I make below would also work as a remotely hosted option. But, where possible, I will focus on them being self-hosted. 

Some of the following functionality can be hosted on a single server, but I recommend breaking out key services to run on dedicated servers — possibly many, depending on your particular needs (such as an expectation of large file repositories) or large numbers of employees. 

Only open-source software 

For security and customization reasons, I will be opting to utilize only open-source and free software here. There are simply far too many drawbacks to basing a corporate infrastructure on closed source systems. 

This decision was easy and obvious for anyone who’s worked in IT for more than a few years. 

Kolab for email and calendaring 

For email, calendaring and general groupware functionality (meeting requests and the like) I opt to go with Kolab. It’s open source, and there’s a company behind it that will provide paid support as needed or desired. 

Kolab has a great web interface for all of the key functionality, but it will work just as well with almost any email and calendar clients in existence.

ownCloud or Nextcloud for file sharing/document collaboration

Since we’ll be going all open source, file sharing (and online file storage) options such as Dropbox and Google Drive are simply not an option. 

There are some features along these lines built into Kolab, but not quite enough. I’d like something a little more powerful and extensible, which means running either ownCloud or Nextcloud.

The two systems are very similar in many respects — not surprising, since Nextcloud is a fork of ownCloud, run by ownCloud's founder. Both will, in all reality, meet most file sharing/storage needs quite well. 

However, ownCloud does contain some closed-source bits aimed at larger organizations. On the flipside, Nextcloud has made a public commitment to offer all features as 100% free and open-source software. With that in mind, I would opt for Nextcloud. 

As an added bonus, Nextcloud handles document collaboration quite well via Collabora Online. Two birds, one stone. 

Matrix for instant messaging 

No. Using Google Hangouts is not a reasonable option for your company’s instant messaging. Neither is Skype. We need something that a) can be hosted in house, b) is open source, and c) is as secure and private as possible. 

I’ve opted to go with Matrix. Not only does it check all three of those key criteria, but it has two rather interesting features that, while they may never be used, are nice to have around as options: 
  • A decentralized design. Meaning that, as the organization grows, new server instances could be added, say, for different parts of the company or different locales. 
  • The ability to bridge Matrix to other services, such as IRC, Slack, etc. This can make it much easier to integrate with external teams or communities.

Again, maybe your organization will never use those two features, but having them around doesn’t hurt.

Bonus points: Matrix handles video chats. Got a big, remote team? If everyone’s on Matrix, there’s no need for company-issued cell phones (or land lines). 

Linux-based OS and software for workstations

Not choosing Microsoft Windows is the first obvious decision here. The cost is too high (both in terms of up-front monetary investment and the recurring costs of securing a closed platform). macOS is, for the same reasons, off the table. 

Which specific platform I choose at that point comes down to my specific needs within the organization. Chances are I would select a Linux-based platform (either a free Linux distribution – Debian, openSUSE, Fedora, etc. – or a similar system with paid support). Support is the main reason to consider a paid, closed system anyway, so I might as well get all the benefits with none of the drawbacks of a system like Windows.

Save money, increase security. No brainer. 

For applications, I’d also standardize around LibreOffice for the office suite and one of the several open-source web browsers (such as Firefox).

In short: All open source 

Clearly an all open-source workplace makes the most sense. Save money. Be more secure. More flexible. Those are all good things.

If you’re reading this and you are responsible for making IT decisions within your company, remember all of these when it comes time to renew your Microsoft Exchange license. Or it’s time to upgrade Windows. Or pay for yet another month/quarter of your video conferencing and file storage system.

Maybe my specific choices here won’t match your needs exactly, but for most of you, there are going to be open-source solutions that will.

Tuesday, August 22, 2017

Ethereum Versus NEM - The Obvious Choice


Lots of new projects and startups are coming into the crypto space. Right now, more than 10 ICOs are running at the same time, and every day we see more and more coming.

It is very important to focus on the planning and procedures of everything related to blockchain technology when starting a new project. The money and confidence of thousands of investors are among the most difficult things to earn at this early stage, and The DAO incident is still present in the mind of every single member of the ecosystem.

Every entrepreneur in the world knows how difficult it is to run a startup. One of the most important things to keep in mind is that the confidence of clients is very difficult to earn and very easy to lose, and this is even more pronounced in our sector.

The blockchain ecosystem has suffered massive software catastrophes one after another, and this is probably one of the reasons its expansion has slowed. Because of this, we need, now more than ever, a development framework that lets us build robust applications with the highest level of security possible.

Nowadays, Ethereum is the blockchain system that provides the most flexibility for building smart-contract-based systems. It has a broad range of development tools made by and for developers, and some of its architecture and security standards have been well defined.

However, Ethereum contracts are still like legal contracts in ancient Greece: people had some “secure”, standardized contract forms, but when they had to write more complex contracts, lots of tricks and backdoors appeared.

This sticking point is one of the biggest pains for developers and entrepreneurs right now.

We can compare Ethereum with the Android OS. Android is really useful because it lets developers make mobile apps easily, but one of the tradeoffs is security: lots of phishing apps have been discovered, and lots of money has been lost, because of the broad freedom Android gives to app creators.

We also spend lots of time thinking about how to promote adoption of blockchain technology. For that to happen, we probably need simpler applications that are more secure, so we can promote its use while developing and testing more complex systems.

Here’s where NEM kicks in. NEM was built from scratch and has been tested by many banks and big corporations. With extensive enterprise developing experience under their belts, several full-time NEM developers write and execute thousands of tests before every development cycle and release, and this gives NEM a very secure core.

NEM’s approach is to let developers use a wide range of combinable functionalities, letting them build powerful applications based on a closed set of atomic operations, and it opens the network to almost any technological combination thanks to its REST API. Notably, mixing, matching, and combining Namespaces (unique domains), Mosaics (customizable assets), 2.0 multisig contracts, and three forms of messaging allows a wide variety of application frameworks to be built. By assigning meaning to these different functions and combining them in various ways, a wide range of applications is currently under development, including apps for transmitting financial value, notarization, tracking and logistics, voting, land management, ID management, and more.

NEM's architecture makes it very simple for developers to build blockchain applications on almost any device with the same degree of decentralization and security. For example, the NEM NanoWallet runs on desktops and smartphones without any problem, and can be used as a boilerplate for more complicated applications, making customized apps built with it accessible on Windows, Mac, Linux, iOS, and Android.

Another critical thing for a decentralized application is scalability. Right now, Ethereum handles at most around 15 transactions per second, while NEM can scale to hundreds of transactions per second, and has already been tested privately and independently to scale to thousands of transactions per second in the Catapult release. Catapult is currently in a closed beta and is scheduled to go live in 2017 if all goes well.

But the most important thing is this: NEM has never had a serious security issue. The commitment of the NEM Foundation and the core developers to the security and availability of the network is of the utmost importance, and it gives the community, third-party developers, and investors a guarantee that is hard to match.

No one doubts the strong potential of both technologies, but right now NEM seems the more stable choice for building new applications that have to support real business models today. Its security and development facilities let blockchain entrepreneurs focus on relevant problems rather than technical difficulties, and its learning curve is far smoother than Ethereum's.

NEM is great for lots and lots of applications that many people would want, and it is easy to build on. Ethereum is better for very specialized projects, but it needs somebody with a very high level of skill to do it exactly right. With NEM, it’s already done for you, so you can't really hurt yourself if something goes wrong.

Monday, August 21, 2017

What is Proof-of-Importance (POI) and Why Is It Better, and What Is Vesting?

What is POI?

Proof-of-Importance, proof-of-work, and proof-of-stake all have one thing in common. They are all algorithms, which when applied to cryptocurrency help to maintain the order in which blocks are selected. This becomes important when we start to think of things such as double spending. This is where money is spent more than once (fraudulently). For example, some currencies use verification of each transaction in the blockchain to prevent this.

To understand why NEM's idea of POI is so revolutionary, we first need to understand how POW and POS work.

POW (Proof-of-Work) was the first system to be implemented and is used by cryptocurrencies such as Bitcoin and also Scrypt coins such as Dogecoin.

In order to “earn” these cryptocurrencies, you must use your computer to mine the coin; the greater your machine’s power, the bigger chance you have of earning.

Why did they make it so that creating a block is both expensive and time-consuming? Because it requires large computational power to make a block, attacks on the blockchain become harder to carry out, as the attacker would have to spend an unfeasible amount of resources.

Note that very many cryptocurrencies (including NEM) have some sort of blockchain explorer, which allows anyone to see any transactions as well as the mining of blocks.

Blockchain technology can also be used for file sharing and proof of asset ownership, and many other things!

However, it was not long before people realised the obvious problems.

Mining (the process in which computational power is used to make new blocks), has very little use.

As technology gets better, people have to spend more money to get the latest ASICs (machines specifically for mining), meaning, even more energy is wasted.

It is pointless to try mining with a CPU. You are competing against companies with rooms full of ASICs, and electricity costs mean it is a waste of time. For example, with a decent CPU hash rate of 0.1 kH/s, in one week you would not even make a cent!

Another problem is that, since only the rich can afford expensive ASICs, they just get richer and richer. In other words, wealth is spread very unevenly, with the top 1% of Bitcoin holders owning 80% of all bitcoins (as of 2014). Many of the richest do not actively use their money, meaning that they contribute very little to the community.

This was why the POS (Proof-of-Stake) system was introduced. It was implemented first by the well-known Peercoin cryptocurrency. Instead of conventional mining, it asks participants to prove ownership of their “stake,” or how much Peercoin they possess.

Larger and older sets of coins have a higher probability of signing the next block, and a lot of computing power is saved.

Again, however, there are problems. Richer users are more likely to sign the next block, and the more blocks they obtain, the richer they get. The problem is the same, richer users will gain wealth much faster than others.

This is where NEM comes in. Its POI system not only rewards those with a large account balance, but also takes into account how much they transact to others and who they transact with.

This means that those who actively help the economy, and therefore NEM, benefit — the right people are rewarded. Each user is given a trust score: the higher it is, the more chance they have of being rewarded.

The good thing is that this will mean much more even wealth distribution; anyone who contributes can gain extra XEM (the currency of the NEM network). NEM is great because it gives similar opportunities to everyone. The main aim is to empower regular people.

This rewarding is done through harvesting, a process in which a node calculates blocks that are then added to the blockchain.

To do this you need a vested balance of 10,000 XEM.

But what is vesting?


When a person first deposits XEM into an account, none of it will be vested. After 24 hours, 10% of the balance becomes vested. After the next 24 hours, another 10% of the remaining unvested balance is then vested. This cycle carries on as long as the XEM is kept in your wallet. If you make a transaction, both vested and unvested coins will be used so that the ratio of unvested:vested coins remains the same.

The point of this is to build up trust: it shows either that you have held your coins for a while or that you have a very large amount (for example, if you have 100,000 XEM, it takes only 24 hours for you to have 10,000 vested coins).
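The vesting schedule described above — 10% of the remaining unvested balance vests every 24 hours — can be sketched in a few lines of Python:

```python
def vested_after(initial_xem: float, days: int) -> float:
    """Vested balance after `days` full 24-hour periods, assuming the
    deposit sits untouched: 10% of whatever is still unvested vests
    each day, so the unvested portion decays by a factor of 0.9 daily."""
    unvested = initial_xem
    for _ in range(days):
        unvested *= 0.9
    return initial_xem - unvested

print(vested_after(100_000, 1))  # 10000.0 -- the harvesting threshold in one day
print(vested_after(100_000, 2))  # 19000.0
print(vested_after(10_000, 7))   # ~5217 -- smaller deposits take much longer
```

Note the decay is geometric: a 10,000 XEM deposit never quite reaches 10,000 vested on its own, which is why the harvesting threshold effectively requires holding somewhat more than the minimum.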

Once you have enough vested coins, you can mine either through local harvesting or delegated harvesting.

Local harvesting is much easier to set up, but has more disadvantages than delegated harvesting.

To start local harvesting, select harvested blocks from the left-hand menu, then click “Start local harvesting.”

If you are interested in setting up delegated harvesting, please go here

Cloud Native Landscape

What's the difference between XEM, BTC, and ETH?

Ever since the dawn of currency, money has been controlled by a central entity. This central entity could do whatever it wanted with its currency: weaken it, strengthen it, take it away from you, anything. The money was only valuable because this central entity said it was. The sad part is, we're still using this form of currency today — in the form of your dollars, euros, yen, or anything of that sort.

In 2008 a man calling himself Satoshi Nakamoto decided he wanted to fix all this, and created the original cryptocurrency - Bitcoin. Bitcoin was a great and innovative idea, and it created the idea of the blockchain. The blockchain is a public ledger of every transaction that ever occurred, and as such could be verified by anyone.

Now it's 2016, and there are hundreds of other cryptocurrencies out there, so I'm going to explain to you the pros and cons of the larger ones, and why XEM is really the way to go.
Bitcoin was the original. As we have seen, the original is not always best - but it still was innovative. It uses a public ledger called a blockchain for security, but that's about the only security measure added.

The ideas in Bitcoin are applied to both Ethereum and NEM, and a simple rundown of all of this can be found in this video, created around the time bitcoin started becoming popular.

The ideas behind bitcoin have been used in every cryptocurrency since, so it’s important to understand how a transaction in Bitcoin works.

As for the advantages of Bitcoin: NEM and Ethereum can both do whatever Bitcoin can, but better.

If you want to know how a transaction works in Ethereum, look at the infographic about Bitcoin; it works the exact same way.

Ethereum is really big right now because it includes two main features over Bitcoin.

  • Smart Contracts allow you to write applications in the blockchain that usually run as programmed.
  • An 'ASIC-proof' mining algorithm makes it profitable for people without expensive hardware to mine.
The ASIC-proof algorithm is still proof-of-work, however, and so it suffers from the same pitfalls that Bitcoin suffers from.

The cryptocurrency community really loved smart contracts for a while. The way it was advertised was absolutely brilliant. "World Computer." "Applications that run as programmed with no possibility of downtime." Except, maybe it works a little too well.

Recently the largest smart contract in Ethereum was hacked, due to a fatal flaw in Solidity and in how smart contracts work. One good write-up explains that if a smart contract has a vulnerability (as we just saw happen in an audited contract), the hacker can legally make off with the funds. This is absolute heaven for a hacker.


NEM uses PoI, also known as proof-of-importance. This means that (unlike Bitcoin and Ethereum) NEM is environmentally friendly, and more secure. Unlike mining Bitcoin and Ethereum, network upkeep does not require thousands of electricity-hogging mining machines. A NEM node can be run on a computer as simple and cheap as a Raspberry Pi, which costs only $35 and uses very little electricity. PoI also encourages people to actually use NEM, rather than just hoard it. For a more detailed explanation, check out the previous article comparing PoW, PoS, and PoI.

NEM is also superior in security. It uses EigenTrust++ for node reputation, which is not used in any other cryptocurrency, and strengthens the security of the network considerably. It also uses localized spam protection, which shuts down spammers, and only the spammers, when the network is at full capacity. Both are only found in NEM.

NEM was built with a two tier design in mind as well. If you want a wallet, you don’t need a full node and a copy of the blockchain. Instead, you can just connect to any node, and have access to all the same features without trusting it. Even a malicious node has no access to your funds, and the worst it can do is just not work. In order to make sure that nodes continue to operate, the developers created a supernode program, which gives a greater incentive for people to maintain the network for years to come.

NEM isn’t only better in the security aspect, however. It also brings a lot of new or improved features to the table. Unlike Bitcoin, multi-signature accounts are on the blockchain, and do not require trusting a third party. Ethereum does have contracts, but you need to write them yourself, which means that pretty much only developers can do it. As mentioned in the Ethereum section, this can be very, very hard to do right due to the language in which smart contracts are written. NEM has made creating or editing multi-signature contracts as easy as a few clicks.

Another advanced feature is mosaics. This works similarly to colored coins (custom currencies) in Bitcoin, but is done completely on chain, rather than requiring the trust of a third party. The names of these colored coins are based on namespaces, which work much like domain names on the internet. Once a namespace is created, no one else can claim the same one, and the owner can make unlimited subdomains.
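The first-come, first-served namespace rules can be modelled with a small in-memory toy (this is purely illustrative — it is not NEM's actual API, and the names and addresses are invented):

```python
class NamespaceRegistry:
    """Toy model of namespace ownership: a root name can be claimed
    once, and only its owner may create sub-namespaces under it."""

    def __init__(self):
        self.owners = {}  # full namespace name -> owner address

    def claim_root(self, name: str, owner: str) -> None:
        if name in self.owners:
            raise ValueError(f"namespace '{name}' already taken")
        self.owners[name] = owner

    def claim_sub(self, parent: str, sub: str, owner: str) -> None:
        if self.owners.get(parent) != owner:
            raise ValueError("only the parent namespace's owner may create subdomains")
        self.owners[f"{parent}.{sub}"] = owner

reg = NamespaceRegistry()
reg.claim_root("acme", "NA-ALICE")          # hypothetical address
reg.claim_sub("acme", "tokens", "NA-ALICE") # subdomain for a mosaic
print(sorted(reg.owners))                   # ['acme', 'acme.tokens']
```

On NEM itself these claims are transactions on the chain, so uniqueness and ownership are enforced by consensus rather than by a single registry process.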

A platform is never complete without messaging, however, and NEM includes either encrypted or unencrypted messaging between addresses, completely through the network. There’s even hex messaging for developers.

While Ethereum and Bitcoin are rewarding miners for making blocks, they aren’t giving incentives for running full nodes and supporting network throughput. NEM has a program called Supernodes that rewards people for running high powered nodes that serve light wallets with data quickly and securely. These rewards were set aside during the first block of the NEM network and are planned to last for years. In the event that the supernode funds do run out, there is always incentive to maintain the network. Anyone with 10,000 XEM can make a harvesting node, and collect transaction fees based on their PoI score. And instead of having to buy powerful and expensive mining equipment that uses high amounts of electricity, NEM harvesters can run a node on a computer as simple as a Raspberry Pi.

NEM is the first private/public blockchain, built in the open using the same development model that created Linux, widely accepted as the most secure OS in the business world. NEM was built by experienced developers, with scalability and stability in mind from day one. It is also currently the only platform that has been stress-tested by banks and approved for financial use; other currencies have been tested but have not shared any proof, whereas all of NEM's test results are open for anyone to see.

NEM has also tried to make it as easy as possible for third parties to build on the blockchain. On platforms like Bitcoin, most third-party developers are locked into using one centralized service such as Coinbase or BitPay to build their ecosystem, which means they rely on those services to build, update, and maintain APIs. On Ethereum, each developer writes their own contract code, which is far more versatile and flexible but, as mentioned before, carries the risk that the developer must know exactly what they are doing. NEM, on the other hand, offers a full set of rich, easy-to-use JSON/RESTful APIs that work across the entire network with any node and support a wide variety of calls, including all transaction types.
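As a concrete sketch, a node's chain-height endpoint can be queried over plain HTTP and parsed as JSON. The port and path below follow the NIS API as I understand it, so treat them as assumptions and check the official NEM docs; the host name is just a placeholder:

```python
import json
from urllib.request import urlopen

NIS_PORT = 7890  # default NIS port (assumption; confirm in the NEM API docs)

def chain_height_url(host):
    """Build the URL of the chain-height endpoint on any NIS node."""
    return f"http://{host}:{NIS_PORT}/chain/height"

def parse_height(raw):
    """Pull the block height out of the node's JSON reply."""
    return json.loads(raw)["height"]

# Against a live node (network access required; host is a placeholder):
#   height = parse_height(urlopen(chain_height_url("some-nis-node")).read())
sample_reply = '{"height": 1234567}'
height = parse_height(sample_reply)
```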

All of this was built from 100% new code, and as such avoids the pitfalls of the other two platforms, while still benefiting from their shared foundation, the blockchain.

If you skipped all of that because you don’t like giant walls of text, here’s an easier to digest infographic.

Sunday, August 20, 2017

"C" is the center of Security

After every major breach, there is a wave of information on security technologies that could have prevented it. Conferences have hundreds of companies exhibiting their technologies, differentiators, and success stories. Research firms publish informative findings, guides, quadrants, waves, and so on, and most products and solutions are good if implemented and used well. So, in some ways, we have an abundance of security controls and technologies.
We often talk about coming together to fight the bad actors; industry groups of companies sharing experiences and government-industry partnerships are examples. These initiatives have been successful to the extent possible given business and competitive pressures.
There is also a need for collaboration between security vendors to help clients manage risk. Risk managers don't build a single-dimensional posture that depends on one technology; that goes against the basic principles of risk management. So, naturally, collaboration between security technologies is important.
This collaboration, the “C” in Security, is manifesting itself in three models:
  1. The Security Marketplaces: Large platform players run marketplaces (Splunkbase, IBM AppExchange, Cisco etc.) that let other security products publish their “apps”. The idea is that clients of the platform can simply download the apps for the desired functionality.
  2. The Security APIs: Companies have published APIs enabling relevant integrations with other technologies. These APIs typically expose information and data that can be ingested for enhancements or actions.
  3. The Security Apps: To enhance their functionality and value, companies are creating apps for the marketplaces and integrations against the APIs. This gives niche companies an excellent platform for joining the ecosystem, the Digital business of security.
The good news is that these models are not driven merely by a “good to have” desire for collaboration; they come with a “must have” commercial model and embrace the Digital way.
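A minimal sketch of what such an integration looks like in practice: an "app" pulls a vendor feed and translates it into the platform's common schema. The feed format and vendor name here are entirely hypothetical:

```python
import json

def normalize_indicators(raw_feed):
    """Translate one (hypothetical) vendor's JSON feed into a common schema.

    Each marketplace app does a mapping like this so the platform can
    ingest indicators from many different vendors uniformly.
    """
    common = []
    for item in json.loads(raw_feed).get("indicators", []):
        common.append({
            "type": item.get("kind", "unknown"),  # e.g. "ip", "domain", "hash"
            "value": item["value"],
            "source": "vendor-x",                 # hypothetical vendor name
            "severity": item.get("score", 0),
        })
    return common

feed = '{"indicators": [{"kind": "ip", "value": "203.0.113.7", "score": 8}]}'
normalized = normalize_indicators(feed)
```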
The “C” in Security is here to stay and grow. 

The Three Most Common Cloud Native Development and Deployment Models

1. Kubernetes (Containers): Full control over infrastructure and maximum portability.

2. Cloud Foundry (Applications): Focus on the applications and let the platform handle the rest.

3. Apache OpenWhisk (Functions): Auto-scaled, event-driven applications that respond to a variety of triggers.
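To make the functions model concrete: an Apache OpenWhisk Python action is just a function named main that takes a dict of parameters and returns a dict, and the platform invokes and auto-scales it in response to triggers (a minimal sketch; deployment details vary):

```python
def main(params):
    """Entry point of an Apache OpenWhisk Python action.

    The platform calls main() with the trigger's parameters and scales
    instances up and down with demand; the return value must be a dict.
    """
    name = params.get("name", "world")
    return {"greeting": f"Hello, {name}!"}
```

Such an action would typically be deployed with the wsk CLI (e.g. `wsk action create greet greet.py`) and wired to event sources through triggers and rules.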

Sunday, August 13, 2017

Blockchain is the Future of Healthcare

Everything we know about health care can benefit from blockchain technology. This is especially true for medical records, sharing patient information, and making data more interoperable. Right now, a lot of intermediaries are involved in sharing patient data because very few systems are compatible with one another. The blockchain will effectively create a new model for the exchange of health information. Electronic records are the way forward; however, they need to be made more efficient, more secure, and no longer reliant on intermediaries.
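The tamper-evidence a blockchain brings to record exchange can be shown with a minimal hash chain. This is a toy sketch, not a production design: each entry commits to the hash of the previous one, so altering any earlier record invalidates everything after it.

```python
import hashlib
import json

def add_record(chain, record):
    """Append a record, linking it to the hash of the previous entry."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    chain.append({"record": record, "prev": prev_hash,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})

def verify(chain):
    """Recompute every link; returns False if any entry was altered."""
    prev_hash = "0" * 64
    for entry in chain:
        body = json.dumps({"record": entry["record"], "prev": prev_hash},
                          sort_keys=True)
        if entry["prev"] != prev_hash or \
           entry["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev_hash = entry["hash"]
    return True

chain = []
add_record(chain, {"patient": "p-001", "shared_with": "clinic-a"})
add_record(chain, {"patient": "p-001", "shared_with": "lab-b"})
```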

Overcoming all of those challenges will not be easy, though. A blockchain-powered healthcare ecosystem would raise the bar as far as interoperability is concerned. Eliminating friction and reducing costs associated with using current systems will create a healthier ecosystem for all parties involved. Now is the time to capitalize on this technology and its momentum.

Additionally, using blockchain will help in generating valuable insights. Being able to see the proverbial bigger picture will benefit patients and healthcare personnel alike. Moreover, it will help improve the value of care as well. All of this will require a universal blockchain system connecting healthcare facilities from all over the world. Private blockchains, which only allow approved partners to examine and share data, are not a viable strategy for the healthcare sector by any means.

“While digital technology has dramatically transformed the management of our financial information and transactions, most of us have not seen such gains when it comes to our health data. As the world watches the adoption of blockchain as a new model of decentralization and security for financial services, we believe the time has arrived when the health sector can finally offer the freedom to own our health data using a smart wallet to manage private and public keys with different levels of permission. This will enable patients to share data on an as needed basis with trusted providers.”

– Dr. Rhea Mehta

There are indeed economic, technical, and behavioral challenges facing current healthcare models, and no one has built a large-scale blockchain ecosystem for the healthcare sector to date. That situation will change sooner rather than later, with companies like Bowhead Health working to bring blockchain technology to the medical sector as we speak. These unique opportunities need to be pursued with properly developed proofs of concept.

Saturday, August 12, 2017

Open Source Dashboard Software - Business Intelligence

Visualized history of the cryptocurrencies

Ultimate Cryptocurrency Guide


Tuesday, August 1, 2017

Why the DEVOPS model is here to stay

The DEVOPS model may seem like just another term that management gets enamored with every few years, but it is much more than that. DEVOPS will probably be the main way development is done going into the future. We are sure that it will not only become the main way to develop software, but will also bleed into other industries and become the main way of designing and developing new products and services. Here are a few reasons why.

DEVOPS enables user-centric products
How many times have you been using software, gotten frustrated, and asked, “Who the hell made this software?” The answer is “a person who will never use the software”. Bad design almost always comes from developers and designers who don't really know the use case of the software. DEVOPS involves users in the development phase, which allows the software to be user-friendly at its core. This is why DEVOPS is so important: it allows the people who will end up using the software to make important contributions to its creation and design.

We cannot expect developers to know everything
There is a huge problem in software design and development which DEVOPS solves. Development can only be done by developers, yet developers are almost never the end users of the software they are creating. Imagine that you want software built that helps architects design houses and buildings. Who will you hire to build it? Developers, obviously. Yet you cannot expect those developers to also know architecture.

This problem is replicated in pretty much every industry out there. You want software that helps the agriculture industry? Developers don’t know anything about agriculture and agriculture experts don’t know anything about software. The DEVOPS model solves this big problem by enabling the users and developers to work together. If there are any fundamental flaws in the way the software operates they can be eliminated at the design phase before the developers waste their time creating software that is of no use to anyone.

It increases success levels
Due to the reasons discussed above, DEVOPS results in software that works better than ever before. There are many horror stories of companies that spent millions developing software and solutions their employees could never use properly, and ended up taking a huge loss. DEVOPS ensures that stories like this don't repeat. Developers are free to focus on the features that will be most helpful for users, because the users are right there to tell them what they need. There is no guesswork involved, and thus a much, much lower chance that things will end in failure. DEVOPS isn't just a new way to work; it is the recognition that past mistakes happened because of the information and communication gap between users and developers. The sooner we close this gap, the better it will be for all of us.
