XenonStack

A Stack Innovator


Monday 6 January 2020


IoT Platform and Internet of Things Applications

Scalable, Secure IoT Platform on Google Cloud Platform

What is the Internet of Things Platform?

The Internet of Things (IoT) is a network of physical devices connected over the Internet and capable of sending data to the Cloud. The things in IoT are equipped with different sensors, each of which interacts with a physical object and collects information from it.

How Does an IoT Platform Work?

Each device is fitted with one or more sensors that interact with a physical object and collect readings from it.
The devices are configured to push this sensor data to the Cloud, where the required transformations and analytics can be performed.
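To make that flow concrete, here is a minimal device-side sketch, assuming the paho-mqtt client library; the broker endpoint, topic name, and payload fields are hypothetical, not tied to any specific platform.

```python
# Minimal device-side telemetry publisher (a sketch, not tied to any specific
# platform). Assumes paho-mqtt (pip install paho-mqtt); the broker host,
# topic, and payload fields below are hypothetical.
import json
import random
import time

import paho.mqtt.client as mqtt

BROKER_HOST = "broker.example.com"       # hypothetical broker endpoint
TOPIC = "devices/device-42/telemetry"    # hypothetical topic name

client = mqtt.Client(client_id="device-42")
client.connect(BROKER_HOST, port=1883)
client.loop_start()                      # network I/O in a background thread

while True:
    # A single device may carry multiple sensors; bundle the readings.
    reading = {
        "temperature_c": round(random.uniform(18.0, 28.0), 2),
        "humidity_pct": round(random.uniform(30.0, 60.0), 2),
        "ts": int(time.time()),
    }
    client.publish(TOPIC, json.dumps(reading), qos=1)
    time.sleep(10)                       # push a reading every 10 seconds
```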

Internet of Things Platform Benefits

  • IoT devices can be used to implement a Smart Home.
  • IoT devices can be used to monitor assets in the industrial sector.
  • IoT can collect data from many devices and generate alerts whenever a threshold is crossed.
  • IoT enables intelligent industries.
  • In healthcare, IoT can be used to monitor patients remotely and guide treatment from a remote location.

Why Does Adopting the Internet of Things Matter?

The growth in the use of IoT devices is changing lifestyles, personal health, habits, environments, and industries across different sectors. IoT makes it possible to capture physical data and apply analytics to what is collected. IoT has made life much easier in fields such as Healthcare, the Industrial Sector, Home Automation, Environmental Monitoring, and Retail.

How to Adopt the Internet of Things (IoT)?

IoT plays a different role in each sector; the common thread is that it connects physical devices to the Cloud.
Healthcare System – In healthcare, IoT can be used to monitor a patient’s heart rate or blood pressure. It can generate an alert to notify the doctor so that the doctor can make immediate decisions. In hospitals, it can also help with asset management and with configuring or tuning a device from a remote location.
Industrial – Industrial IoT can be used to monitor assets in the industrial sector. It can help implement Predictive Maintenance to minimize losses in case of failure: IoT can trigger a Real-Time alert when a machine part needs service, so that it can be replaced before it causes a massive loss.
Home Automation – Many IoT devices are available for Home Automation and security. IoT is used for smart monitoring of home appliances, including door sensors, cameras, smoke or fire detectors, and smart electric devices.
Environmental Monitoring – When it comes to monitoring the environment, IoT devices can collect data from sensors in Real-Time, and that data can be used for forecasting.
Retail – In the Retail sector, IoT can help retailers upgrade their stores to provide a better customer experience, and it also helps with theft prevention and sales analytics.

Best Practices for Implementing IoT Cloud Platform

The most important thing while deploying an IoT solution is to take care of security. When everything is exposed to the internet, there is a possibility of a breach that lets attackers misuse the devices.
Network Security – First of all, the person or organization implementing IoT needs to secure the network to which all the IoT devices are connected. Take home automation as an example: whoever implements it should keep the network private, use proper encryption techniques on any Wi-Fi network, and apply appropriate authentication and access control to the network.
Device Security – Beyond the network, device-level security is needed to prevent unauthorized access to the devices. Select the device manufacturer wisely: choose one that provides security patches at regular intervals.
Data Security – After the network and devices, everyone must secure the database or data warehouse where the data collected from IoT devices is stored.
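As one illustration of the data-security point, the sketch below encrypts a sensor payload before it is persisted, using Python's cryptography package; key management is deliberately simplified here and would normally live in a secrets manager.

```python
# Encrypting IoT payloads before storage (a simplified sketch; assumes the
# "cryptography" package: pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()       # in production, load from a secrets manager
fernet = Fernet(key)

payload = b'{"device": "device-42", "temperature_c": 22.5}'
token = fernet.encrypt(payload)   # ciphertext is what gets persisted

# ... later, when the analytics layer reads it back ...
assert fernet.decrypt(token) == payload
```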

Top IoT Cloud Platform and Tools

Ingestion Tools

Apache MiNiFi – Apache MiNiFi is a sub-project of Apache NiFi that can be used to collect data from IoT devices. It has a very small footprint and can run on small devices while consuming meager resources.
StreamSets Data Collector Edge – SDC Edge is a very lightweight agent that is capable of collecting data at its source of creation.

IoT Cloud Platform Solutions

AWS IoT Core is a managed Cloud platform by AWS that enables secure connections between connected devices (sensors) and multiple Cloud services. AWS IoT Core can handle billions of devices and trillions of messages. It also routes those messages to different AWS Cloud services and allows analytics and machine learning models to be built on top of them.
Cloud IoT Core is Google’s managed service that allows us to quickly and securely connect, control, and ingest data from millions of IoT devices. It also communicates with other Google Cloud services for collecting, processing, analyzing, and visualizing the data. Cloud IoT Core is a serverless service that automatically scales in response to Real-Time events. It supports both the HTTP and MQTT protocols for communication with IoT devices.
In Azure, IoT Hub acts as the central unit that allows you to connect, monitor, and manage millions of devices using bi-directional messaging. It supports the AMQP, MQTT, and HTTP protocols for communication with IoT devices. Azure also offers its own protocol gateway, Azure IoT Protocol Gateway, in case a device doesn’t support AMQP, MQTT, or HTTP.
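To illustrate the MQTT path on Google’s platform, here is a condensed sketch of a device connecting to the Cloud IoT Core MQTT bridge; it assumes the PyJWT and paho-mqtt libraries and a device whose RS256 key pair is already registered, and the project, region, registry, and device names are placeholders.

```python
# Publishing to the Cloud IoT Core MQTT bridge (a sketch; assumes PyJWT and
# paho-mqtt, and placeholder project/registry/device names).
import datetime

import jwt                         # pip install pyjwt
import paho.mqtt.client as mqtt

PROJECT, REGION = "my-project", "us-central1"     # placeholders
REGISTRY, DEVICE = "my-registry", "device-42"     # placeholders

def make_jwt(project_id: str, private_key_path: str) -> str:
    """The bridge authenticates devices with a JWT sent as the MQTT password."""
    now = datetime.datetime.utcnow()
    claims = {"iat": now, "exp": now + datetime.timedelta(minutes=60),
              "aud": project_id}
    with open(private_key_path) as f:
        return jwt.encode(claims, f.read(), algorithm="RS256")

client_id = (f"projects/{PROJECT}/locations/{REGION}"
             f"/registries/{REGISTRY}/devices/{DEVICE}")
client = mqtt.Client(client_id=client_id)
# The username is ignored by the bridge; the JWT goes in the password field.
client.username_pw_set(username="unused",
                       password=make_jwt(PROJECT, "rsa_private.pem"))
client.tls_set()                                   # the bridge requires TLS
client.connect("mqtt.googleapis.com", 8883)
client.publish(f"/devices/{DEVICE}/events",        # telemetry topic
               '{"temperature_c": 22.5}', qos=1)
```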

Concluding the Internet of Things Platform and Use Cases

The Internet of Things is a trending keyword in the field of technology, yet it is actually a pretty simple concept: taking things in the world and connecting them to the internet. In the fast-moving pace of digitization, enterprises across industries are investing significant time and effort to ride this wave. For industry-specific case studies and use cases of IoT, click below to know more:

Read more: IoT Platform at XenonStack.com/insights


Overview of ONAP Architecture and Best Practices


What is ONAP?

ONAP (Open Network Automation Platform) is a project under the governance of the Linux Foundation, founded by AT&T and China Mobile. It was created by merging the ECOMP and Open-O projects into ONAP, to bring capabilities for designing, creating, orchestrating, and handling the full lifecycle management of Virtual Network Functions (VNFs), Network Functions Virtualization (NFV), Software Defined Networks (SDN), and the services that all of these things require.
Note – The primary goal of ONAP is to implement the capabilities needed for orchestration and full lifecycle management of VNF deployments.
ONAP is a platform that works above the infrastructure layer to automate the network. It allows end-users to connect products and services through the infrastructure, and it allows deployments of VNFs and scaling of the network in a fully automated manner. The high-level architecture of ONAP consists of different software subsystems, broadly divided into a design-time environment and an execution-time environment that runs what was designed.
The ONAP community defines blueprints for various use cases during each release, which users can adopt immediately. Some essential use cases are –
  • 5G
  • CCVPN
  • VoLTE
  • vCPE
ONAP is set to bring the next revolution in the field of networking by managing virtually defined networks. So what is a VNF, and what is the big deal about virtual networking?
NFV – “Network functions virtualization is a network architecture concept that uses the technologies of IT virtualization to virtualize entire classes of network node functions into building blocks that may connect, or chain together, to create communication services.” – Wikipedia.

Why are VNF and SDN Required?

There are many reasons (such as vendor issues and a complex control plane) to move to virtualization or a software-defined architecture.
All hardware network devices have a data plane (which describes where data is forwarded, through network addressing) and a control plane, the complex one (which acts as the decision-maker and controls where traffic should be sent and how quickly).
The control plane is not simple: when a network architecture has multiple device types, it has a separate control plane for each, which results in multiple decision-makers in your network. This becomes very complex even in a typical network configuration that pairs a router with a firewall device plus a WAN acceleration device.
To solve these issues and reduce the complexity, software-defined or virtual networking abstracts the data plane and the control plane. NFV converts a single hardware task into a virtual machine or software-defined equivalent that does the same work as the hardware device, but in more Agile and adaptive ways. A VNF is a software application used in Network Functions Virtualization (NFV) that has defined interfaces and provides well-defined networking function components; there can be one or more components, for example, a security VNF has functions related to NAT and firewalling.
But VNFs, too, have various challenges, such as vendor compatibility. ONAP makes VNFs and other network functions and services manageable in an automated, policy-driven, Real-Time environment. This gives everyone the ability to fully create, design, and deploy Automated Network Services.

How Does ONAP Work?

ONAP is the result of many software subsystems combined; these subsystems are broadly divided into two major architectural framework parts –
Design-time framework – It defines, designs, and programs the platform. The design-time framework consists of the following subsystems –
Service Design and Creation (SDC) – It defines, simulates, and certifies assets and their associated processes and policies.
Policy – It enables the creation and deployment of rules to instantiate conditions, requirements, constraints, attributes, or needs regarding the assets being provisioned, maintained, or enforced.
Run-time framework – It executes the programmed logic defined in the design phase. It consists of the following subsystems –
  • Active and Available Inventory (AAI)
  • Controllers
  • Dashboard
  • Data Collection, Analytics, and Events (DCAE)
  • Master Service Orchestrator (MSO)
  • ONAP Optimization Framework (OOF)
  • Security Framework

Read more: What is ONAP at XenonStack.com/Insights

Friday 27 December 2019


Visual Analytics Services for Data-Driven Decision Making


Introduction to Visual Analytics

Visual analytics is the process of collecting and examining complex and large data sets (structured or unstructured) to extract useful information, draw conclusions about the datasets, and present the data or information through interactive visual interfaces in a graphical manner.
Data analytics is usually accomplished by extracting or collecting data from different data sources, in the form of numbers, statistics, and the overall activity of an organization, with various deep learning and analytics tools; the data is then processed using data visualization software and presented in the form of graphical charts, figures, and bars.
In today's technology world, data is produced at incredible rates and volumes. Visual Analytics helps the world make this vast and complex amount of data useful and readable: it is the process of collecting and storing data at a rapid rate, then analyzing that data and making it helpful.
Because the human brain processes visual content better than plain text, advanced visual interfaces let humans interact directly with the data analysis capabilities of today’s computers and allow them to make well-informed decisions in complex situations.
Visual analytics tools let you create beautiful, interactive dashboards or reports that are immediately available on the web or a mobile device. A typical tool has a data explorer that makes it easy for a novice analyst to create forecasts, decision trees, or other advanced statistical models.

Visual Analytics Process

Like any other process, Visual Analytics follows a procedure with a feedback loop, which starts from collecting and scraping raw data and proceeds to examining and analyzing data from different data sources, with human interaction, to extract knowledge from the data.
The process moves through the following stages:
  • Collecting and scraping raw data from different sources
  • Storing the data and doing initial analytics
  • Visualizing the data
  • Exploration and analysis of the data
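As a toy walk-through of the middle stages, the sketch below aggregates a small, invented dataset and renders it as a chart, assuming pandas and matplotlib; every column name and figure is made up for illustration.

```python
# Toy "analyze then visualize" example (assumes pandas and matplotlib;
# the dataset and column names are invented).
import pandas as pd
import matplotlib.pyplot as plt

# Stages 1-2: raw records collected and stored in tabular form.
raw = pd.DataFrame({
    "region": ["north", "south", "north", "east", "south", "east"],
    "sales":  [120, 95, 140, 80, 110, 60],
})

# Initial analytics: aggregate sales per region.
per_region = raw.groupby("region")["sales"].sum().sort_values(ascending=False)

# Stage 3: visualize the aggregate as a bar chart.
ax = per_region.plot(kind="bar", title="Sales by region")
ax.set_xlabel("Region")
ax.set_ylabel("Total sales")
plt.tight_layout()
plt.show()   # stage 4: explore the visual, refine, and iterate
```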

Role of Visual Analytics and Data Visualization in the Financial Sector

Many people are confused by the terms visual analytics and data visualization and do not see the differences, while there are differences, and they play different roles in different sectors:
Data visualization is the representation of data in a pictorial, graphical, and interactive manner. Data analytics is a process of enquiring, examining, and decision-making that combines visualization and human factors to analyze hidden data sets and derive meaningful data.
Data analysts turn complex data sets into readable plain English, whether it is sales figures, market research, stocks, logistics, transportation costs, or social media stats.
A Data Visualization engineer, in turn, renders that readable plain text as charts, graphs, and design elements that help a business explain trends and stats more easily.
Data Visualization and Data Analytics are related through the sectors in which they are used; both serve the same industries, such as Finance, Banking, Healthcare, Retail, crime detection, and daily trend analysis.
Data Analytics helps these sectors identify current market trends, forecast the future, and analyze the monthly and yearly growth of a business, supporting the business in being proactive about future growth, while data visualization tools represent the interpreted data in a readable format.
Computers made it possible to make use of data analytics tools to process complex data at high speeds.
Read More: XenonStack/Blogs

Thursday 26 December 2019


Predictive Healthcare Analytics Platform



A Predictive Healthcare Analytics Platform and its solutions are part of advanced analytics used to predict future events. Predictive analytics draws on many techniques, from data mining, statistics, modeling, and machine learning to artificial intelligence, to examine current data and make forecasts about the future.
  • Good healthcare boosts a nation's economy. Precision medicine, together with Big Data, is being leveraged to build better patient profiles as well as predictive models to diagnose and treat diseases.
  • Telemedicine and AI in healthcare make it possible to treat patients remotely using Pattern Recognition, optimized duty allocation, and live data monitoring.
  • Real-Time Big Data for infection control predicts and prevents infections across networks, creating safer environments.
  • Patient Data Analytics supports patient care, prevents readmissions, and enables better pharmaceutical supply chain management and delivery.

Challenges for Building a Predictive Analytics Platform

  • An interface for patients to search nearby doctors by particular healthcare categories.
  • Visibility for patients to see a doctor’s availability online and communicate via text chat, audio, or video call.
  • A visible allotment number for the patient in the waiting queue.
  • Communication with the doctor, including test or medicine suggestions for the patient.
  • An interface for the patient to contact nearby labs to collect a sample and upload test reports to the server, followed by a push notification when the report is ready.
  • Sharing reports with a doctor, followed by a prescription to the patient.
  • Searching for nearby medical stores and placing an order for the prescription received from the doctor.

Solution Offerings for Real-Time Monitoring

Develop a fully automated Healthcare platform using the latest technologies and distributed Agile development methods.

Real-Time Monitoring of User’s Events

Apache Kafka & Spark Streaming – To achieve high concurrency, set up Apache Kafka as a low-latency messaging platform to receive Real-Time user requests from REST APIs (acting as Kafka producers).
Apache Spark Streaming (the processing and computing engine), together with the Spark-Cassandra connector, stored one million events per second in Cassandra. We built an analytics data pipeline using Kafka and Spark Streaming to capture users’ clicks, cookies, and other data to know users better.
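A minimal sketch of such a pipeline in PySpark Structured Streaming is shown below; the topic name, event schema, and Cassandra keyspace/table are assumptions, and the Spark-Cassandra connector must be supplied on the classpath (e.g. via --packages).

```python
# Kafka -> Spark -> Cassandra pipeline (a sketch; the topic, schema, and
# keyspace/table names are assumptions, and the Spark-Cassandra connector
# is expected on the classpath).
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StringType, LongType

spark = SparkSession.builder.appName("user-events").getOrCreate()

schema = (StructType()
          .add("user_id", StringType())
          .add("event", StringType())
          .add("ts", LongType()))

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "kafka:9092")
          .option("subscribe", "user-events")
          .load()
          .select(from_json(col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

def write_batch(df, epoch_id):
    # The connector exposes a batch sink; foreachBatch bridges streaming to it.
    (df.write.format("org.apache.spark.sql.cassandra")
       .options(keyspace="events", table="clicks")
       .mode("append")
       .save())

(events.writeStream
       .foreachBatch(write_batch)
       .option("checkpointLocation", "/tmp/checkpoints/user-events")
       .start()
       .awaitTermination())
```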
Microservices using Spring Cloud, NetFlix OSS, Consul, Docker, and Kubernetes
Develop REST APIs using a Microservices architecture with the Spring Cloud and Spring Boot frameworks in Java. Moreover, use the Async support of the Spring framework to create Async controllers that make the REST APIs easily scalable.
Use Spring to deploy REST and Kubernetes for securing containers and managing them. For the API gateway, use the NetFlix Eureka Server, which acts as a proxy for the REST API and the many Microservices; Consul as DNS enables auto-discovery of the Microservices.

Solutions for Building IoT based Smart Meters




Introduction to Smart Meters

IoT is playing a significant role in energy-upgrade solutions. The use of smart meters is increasing, enabling the intelligent and efficient use of energy at homes and businesses. Many grid power supply companies, small and large industries, and the private residential sector are implementing smart solutions for energy efficiency and sustainability.

Business Challenge for Building the Analytics Platform

We needed to build a complete analytical solution for energy-saving recommendations based on usage in large buildings and industries. A further challenge was to filter the results by floor, building, and meter type (heat, water, electric). Along with the dashboard, usage-based alerting also had to be provided.

Solution Approach for Building IoT based Smart Meters

A complete smart-meter-based analytical dashboard, which includes -
  • Recommendations for energy saving
  • Predictive results for energy bills
  • Real-Time alerting on defined alerting rules (a minimal sketch follows this list)
  • Analytical results based on historical data
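Here is the minimal sketch of the Real-Time alerting rule check referenced above; the rule structure, thresholds, and readings are invented for illustration.

```python
# Minimal usage-threshold alerting (a sketch; rules and readings are invented).
from dataclasses import dataclass

@dataclass
class Rule:
    meter_type: str    # e.g. "electric", "water", "heat"
    max_usage: float   # threshold per reading interval

RULES = [Rule("electric", 50.0), Rule("water", 30.0)]

def check_reading(meter_type: str, usage: float, building: str, floor: int):
    """Emit an alert for every rule the reading violates."""
    for rule in RULES:
        if rule.meter_type == meter_type and usage > rule.max_usage:
            # In a real system this would go to a notification service.
            print(f"ALERT: {meter_type} usage {usage} exceeds "
                  f"{rule.max_usage} ({building}, floor {floor})")

check_reading("electric", 63.2, building="HQ", floor=3)
```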

Tuesday 24 December 2019


Advanced Threat Analytics and Intelligence 


Overview of Advanced Threat Analytics and Intelligence

The security landscape has changed dramatically over recent years. Cyber-attacks nowadays are more pervasive, persistent, and proficient than ever at escaping and contaminating traditional security architectures. Cyber threats have become more complex and complicated, and many companies encounter stealthy attacks in their systems. These attacks target intellectual property and consumer information, or encrypt important data for ransom. Therefore, to protect your IT assets, you must know what is coming, secure your digital interactions, detect and manage inevitable breaches, and safeguard the business chain and regulatory compliance.
Threat Detection is the art of identifying attacks on a computer. While there are a large variety of Cyber Security attacks, most of them fit into one of four categories -
  • Probe
  • Denial of Service (DoS)
  • User to Root
  • Remote to User
Hence, companies are looking for Cyber Security services and solutions to ensure the security of their IT networks. In this use case, we walk through how we built an effective cybersecurity and threat detection system using machine learning.

Apache Metron Overview

Apache Metron is a cybersecurity application framework that provides the ability to ingest, process, and store various security data feeds at scale in order to detect cyber anomalies and enable organizations to take action against them rapidly.

Apache Spot Architecture for Cyber Security

Apache Spot is a cybersecurity project that aims to bring Advanced Analytics to all IT telemetry data on an open, scalable platform. Apache Spot expedites threat detection, investigation, and remediation via machine learning and consolidates all enterprise security data into a comprehensive IT telemetry hub based on open data models.

Threat Detection Using Deep Learning

A multi-layered Deep Learning-based system is very robust, scalable, and adaptable. All identified incidents and patterns are assigned a risk score to help investigate the breach, control data loss, and take precautionary actions for the future.

Threat Detection Using Machine Learning

A Machine Learning-based Threat Detection system automates the process of extracting insights from file samples through better generalization at identifying unknown variations. It also helps in reducing human analysis time.

Challenges to Real-Time Cyber Threat Intelligence

  • To perform Real-Time Threat Intelligence on trillions of messages per year.
  • Storing and Processing the unstructured security data.
  • Combine Machine Learning and Predictive Analytics to perform Real-Time Threat Analytics.

Solution Offerings for Threat Detection and Cyber Security

We deliver Threat Analytics and Intelligence by automating the process of Threat Detection and Analysis. The following steps are performed to automate the process -
  • Network Dataset
  • Pre-Processing of Data
  • Feature Extraction
  • Reduce Data Amount
  • Improve Accuracy
  • Avoid Overfitting

Training and Testing of Data Using Classification Models

  • Decision Tree
  • Random Forest
  • Naive Bayes
  • KNN
  • Result Analysis
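A compact sketch of this train-and-compare step with the four listed models is shown below, using scikit-learn; random stand-in data replaces the real pre-processed network features, which are not shown in this post.

```python
# Comparing the four listed classifiers (a sketch; random stand-in data is
# used in place of the real pre-processed network features).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))      # stand-in feature matrix
y = rng.integers(0, 4, size=1000)    # 4 classes: Probe, DoS, U2R, R2L

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "Decision Tree": DecisionTreeClassifier(),
    "Random Forest": RandomForestClassifier(n_estimators=100),
    "Naive Bayes": GaussianNB(),
    "KNN": KNeighborsClassifier(n_neighbors=5),
}

# Result analysis: fit each model and compare held-out accuracy.
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, model.predict(X_te)))
```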


Cloud Data Migration from On-Premises to Cloud 


Best Practices of Hadoop Infrastructure Migration

Migration involves the movement of data and business applications from an organization's own infrastructure to the Cloud for -
  • Recovery
  • Create Backups
  • Store chunks of data
  • High security
  • Reliability
  • Fault Tolerance

Challenges of Building On-Premises Hadoop Infrastructure

  • Limitation of tools
  • Latency issues
  • Architecture modifications before migration
  • Lack of skilled professionals
  • Integration
  • Cost
  • Loss of transparency

Service Offerings for Building Data Pipeline and Migration Platform

Understand the requirements, involving data sources, data pipelines, etc., for migrating the platform from On-Premises to the Google Cloud Platform.
  • Data Collection Services on Google Compute Engine. Migrate all Data Collection Services, the REST API, and other background services to Google Compute Engine (VMs).
  • Update the Data Collection jobs to write data to Google Buckets. The Data Collection jobs were developed in Node.js and wrote data to Ceph Object Storage, using Ceph as the Data Lake; update the existing code to write the data to Google Buckets and thus use Google Buckets as the Data Lake.
  • Use Apache Airflow to build the Data Pipelines, and build the Data Warehouse using Hive and Spark. Develop a set of Spark jobs that run every 3 hours, check for new files in the Data Lake (Google Buckets), run the transformations, and store the data in the Hive Data Warehouse (a minimal DAG sketch follows this list).
  • Migrate the Airflow Data Pipelines to Google Compute Engine and Hive on HDFS, using a Cloud DataProc cluster for Spark and Hadoop. Migrate the REST API to Google Compute instances.
  • The REST API, which serves prediction results to dashboards and acts as the Data Access Layer for Data Scientists, is migrated to Google Compute instances (VMs).
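Here is the minimal sketch of the three-hourly Airflow DAG mentioned above; the DAG id, file paths, and spark-submit invocation are illustrative assumptions, not the production pipeline.

```python
# Three-hourly Spark transformation DAG (a sketch; DAG id, paths, and the
# spark-submit command are illustrative).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="bucket_to_hive_transform",
    start_date=datetime(2019, 12, 1),
    schedule_interval="0 */3 * * *",   # run every 3 hours
    catchup=False,
) as dag:
    # Check the Data Lake (Google Buckets) for new files, transform them with
    # Spark, and load the result into the Hive warehouse.
    run_transform = BashOperator(
        task_id="run_spark_transform",
        bash_command=(
            "spark-submit --master yarn /opt/jobs/transform_new_files.py "
            "--input gs://data-lake-bucket/raw --output-hive-db warehouse"
        ),
    )
```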

Technology Stack -

  • Node.js-based Data Collection Services (on Google Compute Engine)
  • Google Cloud Storage as the Data Lake (storing raw data coming from the Data Collection Services)
  • Apache Airflow (configuration and scheduling of the Data Pipeline, which runs the Spark transformation jobs)
  • Apache Spark on Cloud DataProc (transforming raw data into structured data)
  • Hive Data Warehouse on Cloud DataProc
  • Play Framework in the Scala language (REST API)
  • Python-based SDKs

Monday 23 December 2019


AI-powered Customer Experience Services and Solutions


Role of AI in Customer Experience

Increasing computation power and falling storage prices are driving a digital transformation in which AI is applied to solve business problems. Some specialists call it the fourth industrial revolution. AI is all about making computers think like humans, including in customer interaction solutions. Using the domain expertise of humans, we feed features to the system to create AI that solves problems in domains like healthcare, stocks, Computer Vision (CV), Natural Language Processing (NLP), Retail, Entertainment, etc. These applications have produced better solutions for problems like cancer prediction (outperforming trained medical experts), stock market prediction by analyzing traits such as people's sentiments, detecting unusual activities in video, etc. The credit goes to the active community of researchers who made solving such problems with greater accuracy possible.
AI can also be used for customer interaction. Just as humans are interactive and intuitive, researchers are building systems that are as interactive and intelligent as humans but resistant to tiring and boredom; the only driving force they need is electricity and a network connection. According to Gartner, by 2020, 85% of customer interactions will be managed without a human.

Causes of the Popularity of AI in Customer Experience and Interaction

  • No human intervention in services
  • Time-efficient
  • Cost-efficient
  • Better data crunching
  • Hungry for data
  • Improves routing of tickets
No human intervention in services — A piece of software deals with the customer's queries. Whatever you ask, it can give you good answers or suggestions without ever saying "pardon?".
Time-efficient — It can handle queries in no time, without needing to think like a human before answering.
Cost-efficient — A single AI bot can handle communications on many channels at once, which saves hiring costs.
Better data crunching — When tackling the wide variety of data collected from different sources, such as feedback, surveys, and customer requirements, humans can get confused about what to deal with first. A fusion of AI and machine learning quantifies the insights collected from the data, which ultimately leads to better strategic decisions.
Hungry for data — AI performs better when fed more (and more varied) data. Take the example of an AI chatbot deployed on a support site: short queries give decent results, but not as good as expected, while richer questions improve its performance because the bot tries to extract the intent from the query.
Improves routing of tickets — For a customer-centric organization, it is necessary to improve a ticket's path. Consider the example of Uber: there can be issues like refund status, a driver not arriving, etc. The system must understand the intent of the problem and route each ticket to the respective customer care executive so that the customer does not have to wait longer. A toy routing sketch is shown below.
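To make the routing idea concrete, here is a toy sketch that infers a ticket's intent from its text and maps it to a queue, using scikit-learn; the example tickets, intents, and queue names are all invented.

```python
# Toy intent-based ticket routing (a sketch; tickets, intents, and queue
# names are invented for illustration).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

tickets = [
    "where is my refund", "refund not credited yet",
    "driver did not arrive", "driver cancelled at pickup",
    "app crashes on login", "cannot log in to my account",
]
intents = ["refund", "refund", "driver", "driver", "app", "app"]

router = make_pipeline(TfidfVectorizer(), LogisticRegression())
router.fit(tickets, intents)

QUEUES = {"refund": "payments-team", "driver": "ops-team", "app": "tech-support"}

new_ticket = "my refund status still shows pending"
intent = router.predict([new_ticket])[0]
print(f"route ticket to: {QUEUES[intent]}")
```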

Some facts and figures related to Artificial Intelligence

  • Robots and AI will replace 7% of jobs in the US by 2025, but the equivalent of 9% of today's jobs will be created.
  • According to Forrester, 8.9 million new jobs will be created by 2025, increasing demand for robot monitoring professionals, data scientists, and automation specialists.
  • By 2020, 52% of consumers will expect companies to provide service via virtual reality.
  • 73% of companies will shift their AI products to the cloud.
  • 58% of consumers want products that can self-diagnose issues and automatically troubleshoot themselves.
  • According to Oracle, nearly 8 out of 10 businesses have adopted AI or are planning to adopt it by 2020.
The secret sauce for a successful business
An organization must focus on customer involvement and engagement. With the introduction of AI in customer interaction, people start enjoying the services, engagement time increases, user trust grows, and brand value improves.

Building blocks for AI in Customer Experience and Interaction

  • Data Unification
  • Real-time insights delivery
  • Business Context
Data Unification — In an organization, data comes from many sources, i.e., records, real-time data, and curated data from the team. The problem is how to merge and match these sets of data to get the best out of them. Companies like General Electric (GE) have spent several years on data unification. It is the most important and tedious phase for data engineering teams: getting the data ready for modeling.
Real-time insights delivery — This is all about analyzing the interest of the customer. Organizations like Amazon and Flipkart have built recommendation algorithms that recommend products promptly and, moreover, can increase the buying capacity of the customer (upsell). According to Harvard Business Review, 60% of business leaders say customer analytics is critical; by 2020, that figure will rise to 79%.
Business Context — When applying AI in an organization, we first need to understand the perspective of the business: who is our target customer, and how do we handle ambiguity in the conversation between customer and support?
 Continue Reading: XenonStack/Blogs