Are you suffering from Accuracy Syndrome?

If your classification model shows great accuracy during testing, yet fails to predict events correctly when validated against future real-time data, you are suffering from accuracy syndrome.

It is detrimental to push such models into decision making. You need to work on your model a little more.

This happens most of the time when the predictor classes are not balanced.

Following are some scenarios where the predictor class can be disproportionate:

1) Redeeming offers issued by retail stores – People tend to forget offers very quickly. Even though retail stores spend a lot of money every week sending promotions and offers to their customers, in practice customers are reluctant to avail them. Only a very small proportion of customers come back to the store to redeem the offers and actually translate them into purchases.

2) Incidents taking place in an old-age home – In an old-age home where caretakers are experienced and staff do their work responsibly, incidents are very rare. If you get a project to predict the incidents in the coming week or month, you will find very few incidents in comparison to non-incidents.

3) Detecting malignant tumours – When a hospital's imaging department gives you a project to build a supervised model for finding malignant tumours, you may come across a negligible number of cancerous tumours; most of them are benign. In such a scenario, even a model that rarely detects a malignant tumour can report high accuracy.

There can be many other cases where predictor classes are disproportionate. These require you to work a little differently while modelling.
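The tumour scenario above can be made concrete with a short sketch (the numbers are illustrative, matching the 98% vs 2% split): a "model" that always predicts the majority class reports high accuracy while catching zero malignant cases.

```python
# Illustrative sketch of the accuracy paradox on an imbalanced dataset.
labels = ["benign"] * 98 + ["malignant"] * 2    # 98% vs 2%, as in scenario 3
predictions = ["benign"] * len(labels)          # always predict the majority class

accuracy = sum(p == y for p, y in zip(predictions, labels)) / len(labels)
malignant_found = sum(p == "malignant" for p in predictions)

print(f"Accuracy: {accuracy:.0%}")              # Accuracy: 98%  -- looks great
print(f"Malignant cases caught: {malignant_found}")  # 0 -- useless in practice
```

The 98% accuracy tells you nothing about the model's ability to find the 2% of cases that actually matter, which is exactly why the parameters discussed in the steps below are needed.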

Following are the steps to overcome accuracy syndrome:

1) Check the proportions of the prediction classes. There is no fixed threshold beyond which they are called disproportionate; this has to be decided by speaking to the domain experts of your organization. The scenarios mentioned above have class proportions of roughly 99.2% vs 0.8%, 95% vs 5%, and 98% vs 2% respectively.

2) Separate out enough data points to be used as a testing dataset. The objective of setting aside the testing dataset before preparing the training dataset is to test how your model works on real-time scenarios.
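One minimal way to carve out such a hold-out set is a stratified split, so the test set mirrors the real-world imbalance. The sketch below uses plain Python on a made-up 95/5 dataset; the 20% test fraction is an assumed choice, not a rule from the text.

```python
import random

random.seed(42)  # reproducible shuffle

# Hypothetical dataset: (id, label) pairs with a 95/5 class imbalance
data = [(i, "no_incident") for i in range(95)] + [(i, "incident") for i in range(5)]

def stratified_split(rows, test_fraction=0.2):
    """Split rows into train/test, preserving per-class proportions."""
    by_class = {}
    for row in rows:
        by_class.setdefault(row[1], []).append(row)
    train, test = [], []
    for members in by_class.values():
        random.shuffle(members)
        cut = int(len(members) * test_fraction)
        test.extend(members[:cut])
        train.extend(members[cut:])
    return train, test

train, test = stratified_split(data)
print(len(train), len(test))  # 80 20
```

Because the split is stratified, the test set keeps the original 95/5 ratio, which is essential if it is to stand in for real-time data.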

3) There are two common ways to tackle this problem:

a) Under-sampling: When you have enough data, the larger class can be under-sampled to match the proportion of the smaller class. A calculation is given below:

Inference: For Proportion 3, accuracy is highest. Hence the model will be built on a training set with 60% success incidents and 40% failure incidents. Once the model is built, parameters such as precision, recall and F1 score, along with the area under the curve, will be calculated. A confusion matrix will also be created to calculate the accuracy. The final decision will be taken based on these parameters, measured on the testing dataset created in step 2.
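The parameters named above all derive from the four confusion-matrix counts. A short sketch, with made-up counts for illustration only:

```python
# Hypothetical confusion-matrix counts on the hold-out test set
tp, fp, fn, tn = 40, 10, 20, 930

precision = tp / (tp + fp)                       # of predicted positives, how many were right
recall    = tp / (tp + fn)                       # of actual positives, how many were found
f1        = 2 * precision * recall / (precision + recall)
accuracy  = (tp + tn) / (tp + fp + fn + tn)

print(f"precision={precision:.2f} recall={recall:.2f} "
      f"f1={f1:.2f} accuracy={accuracy:.2f}")
# precision=0.80 recall=0.67 f1=0.73 accuracy=0.97
```

Note how accuracy (0.97) stays high even though a third of the real positives are missed; precision, recall and F1 expose what accuracy hides on imbalanced data.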

b) Over-sampling: Be careful while over-sampling the smaller class. Naive over-sampling repeats rows of the smaller class, which becomes a problem when that class is very small: there may not be enough distinct data points for the model to learn from.
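Both resampling strategies can be sketched in a few lines of plain Python (class names are illustrative; `random.choices` draws with replacement, which is exactly the row repetition that naive over-sampling performs):

```python
import random

random.seed(0)

# Hypothetical 95/5 imbalanced training data
majority = [("row", "no_redeem") for _ in range(95)]
minority = [("row", "redeem") for _ in range(5)]

# Under-sampling: shrink the majority class down to the minority's size
under_majority = random.sample(majority, len(minority))
undersampled = under_majority + minority          # 5 vs 5 rows

# Over-sampling: repeat minority rows (with replacement) up to the majority's size
over_minority = random.choices(minority, k=len(majority))
oversampled = majority + over_minority            # 95 vs 95 rows

print(len(undersampled), len(oversampled))  # 10 190
```

Under-sampling discards information from the majority class, while over-sampling duplicates minority rows; which trade-off is acceptable depends on how much data you have, as the text above notes.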

Robotic Process Automation

Robotic process automation (RPA) is the application of technology that allows employees in a company to configure computer software or a “robot” to capture and interpret existing applications for processing a transaction, manipulating data, triggering responses and communicating with other digital systems.

Any company that uses labour on a large scale for general knowledge process work, where people perform high-volume, highly transactional process functions, will boost its capabilities and save money and time with robotic process automation software.

Just as industrial robots are remaking the manufacturing industry by creating higher production rates and improved quality, RPA “robots” are revolutionising the way we think about and administer business processes, IT support processes, workflow processes, remote infrastructure and back-office work. RPA provides dramatic improvements in accuracy and cycle time and increased productivity in transaction processing while it elevates the nature of work by removing people from dull, repetitive tasks.

Blockchain Technology

A blockchain is a digitized, decentralized, public ledger of all cryptocurrency transactions. Constantly growing as ‘completed’ blocks (the most recent transactions) are recorded and added to it in chronological order, it allows market participants to keep track of digital currency transactions without central recordkeeping. Each node (a computer connected to the network) gets a copy of the blockchain, which is downloaded automatically.

The blockchain is perhaps the main technological innovation of Bitcoin. Bitcoin isn’t regulated by a central authority. Instead, its users dictate and validate transactions when one person pays another for goods or services, eliminating the need for a third party to process or store payments. The completed transaction is publicly recorded into blocks and eventually into the blockchain, where it’s verified and relayed by other Bitcoin users. On average, a new block is appended to the blockchain every 10 minutes, through mining.

Based on the Bitcoin protocol, the blockchain database is shared by all nodes participating in a system. Upon joining the network, each connected computer receives a copy of the blockchain, which has records of, and stands as proof of, every transaction ever executed.

Blockchain technology is one of the world’s leading platforms for managing digital assets and is considered a pioneer in moving the world towards a cashless economy. It also maintains records of bitcoin transactions in sequential order. Some leading industry experts use blockchain to build platforms that make banking more efficient, and beyond banking it can be applied across many domains where transactions happen.

Blockchain provides a way for two parties to make a secure transaction on a decentralized public network. It is an immutable, distributed and transparent ledger that cannot be altered in any way once it is written. This means that once the data is stored on the blockchain’s distributed peer-to-peer network, its authenticity cannot be brought into question and its value can be verified at any point throughout the network.

A blockchain, originally block chain, is a growing list of records, called blocks, which are linked using cryptography. Each block contains a cryptographic hash of the previous block, a timestamp, and transaction data. Hence, by design, a blockchain is resistant to modification of its data.
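The hash-linking just described can be illustrated with Python's hashlib. This is a toy chain, not a real protocol: each block stores the previous block's hash, so altering any earlier block breaks every link after it.

```python
import hashlib
import json

def make_block(prev_hash, data):
    """Build a toy block whose hash depends on its data and its predecessor's hash."""
    block = {"prev_hash": prev_hash, "data": data}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def valid(chain):
    """A chain is valid if each block points at the recomputed hash of the previous one."""
    for prev, cur in zip(chain, chain[1:]):
        recomputed = hashlib.sha256(
            json.dumps({"prev_hash": prev["prev_hash"], "data": prev["data"]},
                       sort_keys=True).encode()).hexdigest()
        if cur["prev_hash"] != recomputed:
            return False
    return True

# Build a three-block chain (transaction strings are made up)
genesis = make_block("0" * 64, "genesis")
b1 = make_block(genesis["hash"], "Alice pays Bob 5")
b2 = make_block(b1["hash"], "Bob pays Carol 2")
chain = [genesis, b1, b2]

print(valid(chain))                  # True
b1["data"] = "Alice pays Bob 500"    # tamper with history
print(valid(chain))                  # False -- b2 no longer matches b1's hash
```

This is why a blockchain is "resistant to modification": rewriting one block invalidates every block chained after it, and the tampering is immediately detectable by any node holding a copy.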

The blockchain use case has ramifications far beyond the financial sector and is not limited to recording transactions. The technology was first used for recording bitcoin transactions: it was invented by Satoshi Nakamoto in 2008 to serve as Bitcoin's public transaction ledger.


DevOps

One of the main reasons why DevOps is important is that it is not limited to any one technology or specific area. Learning DevOps gives you knowledge across different aspects of software development, such as building, coding, testing, automating, releasing and maintaining software. Due to these inter-disciplinary facets, DevOps is currently a favourite among many people in the software industry. Master the principles of DevOps and upgrade your skills from software professional to DevOps expert by creating software solutions that are faster and more efficient, and by leading the overall process end to end.

Jenkins is an open-source automation tool written in Java, with plugins built for Continuous Integration. Jenkins is used to build and test your software projects continuously, making it easier for developers to integrate changes to the project and for users to obtain a fresh build. It also allows you to continuously deliver your software by integrating with a large number of testing and deployment technologies.

With Jenkins, organizations can accelerate the software development process through automation. Jenkins integrates development life-cycle processes of all kinds, including build, document, test, package, stage, deploy, static analysis and much more.

Jenkins achieves Continuous Integration with the help of plugins, which allow the integration of various DevOps stages. If you want to integrate a particular tool, you install the plugin for that tool: for example Git, the Maven 2 project plugin, Amazon EC2 or HTML Publisher.

Data Science

Data science, also known as data-driven science, is an interdisciplinary field that uses scientific methods, processes, algorithms and systems to extract knowledge or insights from data in various forms, either structured or unstructured, similar to data mining.

Data science is a “concept to unify statistics, data analysis, machine learning and their related methods” in order to “understand and analyze actual phenomena” with data. It employs techniques and theories drawn from many fields within the broad areas of mathematics, statistics, information science, and computer science, in particular from the subdomains of machine learning, classification, cluster analysis, uncertainty quantification, computational science, data mining, databases, and visualization.

The Data Science Certification with R has been designed to give you in-depth knowledge of the various data analytics techniques that can be performed using R. The data science course is packed with real-life projects and case studies, and includes R CloudLab for practice.

Data science is a vast field that encompasses many areas, and it also forms the basis for working with big data and analytics. By building a clear understanding of data science, one can discover many opportunities as more and more businesses become data driven. A data science course teaches you how to analyze data using automated methods, collating data from different devices using sophisticated techniques. Data science is applicable in many areas, such as predictive and prescriptive analysis and machine learning. This data can be used for making critical business decisions that have a larger impact.