Anomaly and Intrusion Detection in IoT Networks with Enterprise-Scale Endpoint Communication

This is part one of a series of articles to be published on LinkedIn based on a classroom project for ISM 647: Cognitive Computing and Artificial Intelligence Applications, taught by Dr. Hamid R. Nemati at the University of North Carolina at Greensboro's Bryan School of Business and Economics.

The Internet of Things (IoT) has been one of the most innovative and exciting areas of technology in the last decade. The IoT is a collection of devices deployed in the world that collect data from the environments around them, whether the human body, a geological area, or the atmosphere, through mechanical, electrical, thermodynamic, or hydrological processes. Networked IoT devices have been prevalent in many industries for years, including oil, gas, and utilities. As companies demand higher sample read rates from sensors, meters, and other IoT devices, and as bad actors from foreign and domestic sources become more prevalent and brazen, these networks have become vulnerable to security threats due to their increasing ubiquity and evolving role in industry. These networks are also prone to read-rate fluctuations that can produce false positives for anomaly and intrusion detection systems in enterprise-scale deployments where devices send TCP/IP transmissions of data upstream to central office locations. This paper focuses on developing an application that uses cognitive computing and artificial intelligence to achieve better anomaly and intrusion detection in enterprise-scale IoT applications.

This project uses automated machine learning to develop a cognitive application that addresses possible security threats in high-volume IoT networks such as utility, smart city, and manufacturing networks. These networks have high communication read success rates across hundreds of thousands to millions of IoT sensors; however, they may still have issues such as:

  1. Noncommunication or missing/gap communication.
  2. Maintenance work orders.
  3. Alarm events (tamper, power outage).

In large-scale IoT networks, such interruptions are a normal part of business operations. Noncommunication typically occurs because devices fail or are swapped out under a legitimate work order. Weather events and people can also cause issues with the endpoint device itself: power outages can cause connected routers to fail, and devices can be tampered with, for example by someone attempting a hardwire bypass or removing a meter.

The scope of this project is to build machine learning models that address IP-specific attacks on the IoT network, such as DDoS attacks originating inside or outside the networking infrastructure. These models should be intelligent enough to distinguish network attacks (true positives) from communication issues (true negatives); a minimal modeling sketch follows the list below. Network communication technologies typical of such an IoT network include:

  1. Short range: Wi-Fi, Zigbee, Bluetooth, Z-Wave, NFC.
  2. Long range: 2G, 3G, 4G, LTE, 5G.
  3. Protocols: IPv4/IPv6, SLIP, uIP, RLP, TCP/UDP.
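
To make the classification task concrete, here is a minimal sketch in Python using scikit-learn. Everything here is an assumption for illustration: the feature names, the synthetic distributions (a flood of packets from many distinct sources standing in for DDoS, sparse retry-heavy traffic standing in for a benign communication issue), and the model choice. It is not the project's actual schema or pipeline.

```python
# Minimal sketch: classifying "attack" vs. "communication issue" flows.
# Feature names and distributions are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)

# Simulated flow features: [packets/sec, distinct source IPs, retry rate]
comm_issue = rng.normal([5, 2, 0.3], [2, 1, 0.1], size=(500, 3))       # quiet, few sources
ddos_attack = rng.normal([900, 150, 0.05], [200, 40, 0.02], size=(500, 3))  # flood traffic

X = np.vstack([comm_issue, ddos_attack])
y = np.array([0] * 500 + [1] * 500)  # 0 = communication issue, 1 = attack

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test),
                            target_names=["comm issue", "attack"]))
```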

Eventually, as such machine learning and deep learning models expand, these types of communications will also be monitored.

Scope of Project

This project will focus on complex IoT systems typical of multi-tier architectures within corporations. As part of the research into the analytical properties of IT systems, the project concentrates on operations that begin with the collection of data through transactions or data sensing, and end with storage in data warehouses, repositories, billing, auditing, and other systems of record. Examples include:

  1. Building a simulator application in Cisco Packet Tracer for a mock IoT network.
  2. Creating a Machine Learning anomaly detection model in Azure.
  3. Generating and collecting simulated and actual TCP/IP network traffic data from open data repositories in order to train and score the team's machine learning model (see the sketch after this list).
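
As a hedged illustration of item 3, a simulated flow generator might look like the sketch below. The field names, port numbers, and the injected 30-minute attack burst are hypothetical choices, not the project's actual data design.

```python
# Minimal sketch of generating labeled, simulated TCP/IP flow records
# for model training. Fields and distributions are illustrative only.
import csv, random
from datetime import datetime, timedelta

random.seed(1)
start = datetime(2019, 1, 1)

with open("simulated_flows.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "src_ip", "dst_port", "bytes", "label"])
    for minute in range(1440):                      # one day of 1-minute windows
        ts = start + timedelta(minutes=minute)
        attack = 600 <= minute < 630                # inject a 30-minute DDoS burst
        flows = random.randint(400, 600) if attack else random.randint(5, 20)
        for _ in range(flows):
            writer.writerow([ts.isoformat(),
                             f"10.0.{random.randint(0, 255)}.{random.randint(1, 254)}",
                             random.choice([80, 443, 502]),
                             random.randint(60, 1500),
                             "attack" if attack else "normal"])
```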

Other characteristics of the IT systems that will be researched as part of this project include systems that perform the following:

  1. Collect, store, aggregate, and transport large data sets.
  2. Require application integration, such as web services, remote API calls, etc.
  3. Extend beyond a single-stack solution.

Next: Business Use Cases and IoT Security

Derek Moore, Erica Davis, and Hank Galbraith, authors.

Big Data as the Next Major Utility: Musings on the Future of Autonomous Vehicles and CASE.

“Big Data” is everywhere. It powers business solutions and drives economic opportunity. Is it possible that “Big Data” will become the next major utility? By utility, I don’t mean its usefulness to businesses. Can data be a utility like electricity, gas, or water, distributed reliably through major cities to meet customer demand? With Smart City initiatives, that certainly appears to be becoming more of a reality, but smart city programs do not necessarily build the B2C model that major utilities do. Autonomous vehicles (AV) and machine learning (ML) may fill the gap that makes “Big Data” a utility. One possible business model has customers pay for how much data they use and when they use it. Since AV technology will rely on data from internal and external sensors to evaluate road conditions and anomalies, the utility business model may come into play as a way to pay for such computation and classification. Machine learning algorithms will help reinforce anomaly and object detection scenarios for AVs.

Cars currently on the market feature Advanced Driver Assistance Systems (ADAS), which include driver-assist technology such as accident-avoidance sensors, drowsiness warnings, pedestrian detection, and lane-departure warnings. Today’s driverless cars are actually vehicles retrofitted with components that allow drivers to remove their hands from the steering wheel. To have fully autonomous vehicles, there must be a supply of historical and near-real-time data to train the ML models that will guide future AVs. Like the generation of electrical power from a turbine, there has to be a supply and distribution approach to ML systems that continuously provides reinforcement learning to AVs. The generation of AV data must be ongoing, every hour of the day for years, in order to continuously train the ML models and build reliability into future AV algorithms and models.

The Future of Autonomous Vehicles

CASE stands for Connected, Autonomous, Shared, Electric (vehicles). In many regards, it’s the evolution of modern transportation: a vehicle that doesn’t need a human operator, but transports people or goods to different destinations effectively, safely, and efficiently with little or no impact on the environment. Not only will this vehicle transport, it will also serve as a data collector and generator that could be used to determine road conditions, connect with businesses, and establish business-to-customer or customer-to-business relationships.

The development of AVs must be based on electrification (electric vehicles). Direct digital control of, and feedback on, electrical consumption is ideal for clean and efficient use of power. The autonomous capabilities of vehicles would control not only direction and speed but also a granularity of electrical consumption that would be imperceptible to a live human operator. Metrics could then be displayed to the passenger, owner, or manufacturer of the AV as feedback on its efficiency.

The main focus of the next generation of fully autonomous vehicles will be the ability to keep passengers safe and to successfully navigate any condition or obstacle as the AV transports its passengers to their destination, from leaving their home, to getting into the vehicle, to walking into the destination. Services will be available that allow AVs to follow exact directions to a business and navigate to approved parking spaces. Most interfacing will be conducted through the passengers’ smartphones.

Here is an example. David picks up his smartphone and clicks on an app to request reservations at a restaurant for his wedding anniversary. The service request is paired with an AV smartphone application that also sends the request to the cloud and the restaurant reservation API. The ML system in the cloud then programs the AV to navigate to the restaurant and park in a designated parking space (no valet needed). When dinner is complete, David clicks on the app to pick him and his wife up and return home.

Future autonomous vehicles will not have manual overrides or speed up to make it to that movie on time.

In order for autonomous vehicles to build trust within the driving community, they must maintain consistent patterns and make decisions that ensure the safety and comfort of all passengers. What you don’t want is an AV that suddenly speeds up to make a light or takes sharp, quick turns to avoid oncoming traffic. This means the automobile needs AI and machine learning capabilities that obey all traffic laws and make correct predictions about any anomaly or object. Future AVs and CASE vehicles will not have steering wheels or brake pedals, because those represent a manual override, which in turn erodes trust with the occupants.

Future generations of AVs should not have steering wheels. Most modern cars rely on a steering system, including a rack-and-pinion assembly, by which a live operator (driver) can turn the car right or left when needed. Removing the steering mechanism will allow for passenger-only occupancy and create a system principally controlled by computers rather than by mechanisms requiring human intervention. In the event that the vehicle requires override control by an operator, that operator will sit in a vehicle control and command operations center (VOC) manned by trained commercial drivers. Such command operation centers could be run by a third party, by the manufacturer of the vehicle, or by a municipality.

Future autonomous vehicles will be fully connected mobile platforms.

Think of a smartphone and everything it does. Now imagine an autonomous vehicle as essentially a large smartphone that can transport passengers who remain connected to what’s happening outside the car. These riders will expect to map the course to their destination through connected devices, data, cloud computing, and sensors, with that information shared with businesses and users before, during, and after they reach their destination. The applications for such connectivity are tremendous.

The impact of Big Data on autonomous vehicles.

As 5G wireless networks come online, smart cities and autonomous vehicles will fully utilize data to the cloud and back. 5G will facilitate unprecedented communication speed from the vehicle to the outside world, allowing sensing and tracking of nearly 5,000 GB of data per vehicle per day and making vehicles more efficient and safe. New computer processor architectures will test, train, and build machine learning and deep learning models faster than in the past, helping train AVs to become better equipped for conditions in cities and on highways.
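
Taking the 5,000 GB figure at face value, a quick back-of-the-envelope calculation shows the sustained uplink rate that continuous transmission would imply; the rest is simple arithmetic.

```python
# Back-of-the-envelope check of the ~5,000 GB/vehicle/day figure:
# what sustained link rate would continuous upload require?
GB_PER_DAY = 5_000
SECONDS_PER_DAY = 24 * 60 * 60

gbits_per_day = GB_PER_DAY * 8                 # gigabytes -> gigabits
mbps = gbits_per_day * 1_000 / SECONDS_PER_DAY # gigabits/day -> megabits/sec
print(f"{mbps:.0f} Mbps sustained")            # ~463 Mbps, 5G-class territory
```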

Maintaining a competitive advantage has become an important business strategy.

One of the things I love about data science and data analytics is that most of the innovation in this area has been shared in open data and open source communities. Sites like Kaggle, Amazon, and Google have offered public data to anyone wanting to perform machine learning, predictive analysis, and deep learning (see my review of DataSciCon.Tech). Open source software and platforms have grown quickly as well.

This is not the case for vendors invested in the future of AVs. The data collected from sensors and IoT devices in the vehicle, as well as in big data cloud systems, is a well-guarded secret. Development SDKs for AV technology are accessible only to clients of the AV manufacturers and their partners. What this will mean for the future of AV innovation is still up for debate; however, companies certainly have the right to safeguard their proprietary research in this area. It’s not completely known what impact this strategy will have on the long-term adoption of AVs.


What Companies Need to Know About Big Data and Social Computing in Information Technology Management

Internet statistics estimate that 500 million tweets are produced per day. That translates to millions of conversations about a vast array of topics. “Big data” is a term that has become more prominent as social media sites such as Twitter, Facebook, and Instagram continue to generate large data streams. Consumers produce clickstream data as they visit corporate websites to make purchases, schedule appointments for services, or type reviews on Yelp, Amazon, and Uber about an experience they’ve had. With a well-planned IS strategy, companies can analyze this data to gain insight into their customers and make the critical strategic decisions necessary to compete. Here are a few things companies should know about “Big Data” and social media computing as a business strategy.

Understand that social media and social networking are more a concept than a platform.

One of the biggest problems with companies adopting social media as part of their IT business strategy is that, for many IT managers, the concept of social media does not extend beyond Twitter and Facebook. There are many platforms through which social media can benefit business. Slack and GitHub build on crowdsourcing around project management, software development, and agile methodologies, even though those platforms are not primarily used for social media.

As more engineering firms adopt open source solutions, agile and DevOps development companies are deciding to use code development repositories such as GitHub. Microsoft has already adopted GitHub as part of its Visual Studio Team Foundation options for source control. The power of GitHub is evident as global communities of developers use it to make some of the most innovative software products in languages such as Python, Java, C#, and Ruby. It has also become a viable social media platform for software engineers who frequently collaborate on sprints. Companies are also turning to solutions such as Slack to build entire global teams of developers who collaborate on projects and sprints.

Social media as an IT business strategy is about understanding its contextual design and how the user interacts with it. Part of understanding the contextual design of social media includes identifying the actors (primary and secondary) on which the platform is based and how those users interact with it to build relationships and communities.

Context also extends to how a user interfaces with social media. Take, for example, the device many of us currently have in our pockets. Apply classifications of contextual scope to this device and consider all the ways users interact with a platform through it (tablet, smartphone, computer, etc.).

A method known as the 4-I’s framework¹ is a good model for understanding user interaction in the context of social media. The method is typically used to classify interactions with information systems as described above. The 4-I’s include the following (a small illustrative sketch follows below):

  • Inscriptive (inputs)
  • Informative (outputs)
  • Interactive (processing)
  • Isolated (stored data)

This framework is useful for examining the ways a user can interact with a platform as well as the information exchanged within it. Another popular method is the Model-View-Controller (MVC) pattern, used in software analysis and engineering as an architecture for implementing user interfaces by separating a system into layers.
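
As a toy illustration only, interactions could be tagged with the 4-I’s categories like this; the event names are hypothetical, not part of the framework itself.

```python
# Illustrative sketch: tagging interaction events with the 4-I's
# categories listed above. Event names are hypothetical.
FOUR_IS = {
    "post_status":  "Inscriptive (input)",
    "view_feed":    "Informative (output)",
    "reply_thread": "Interactive (processing)",
    "archive_chat": "Isolated (stored data)",
}

def classify(event: str) -> str:
    return FOUR_IS.get(event, "unclassified")

for event in ["post_status", "view_feed", "reply_thread", "archive_chat"]:
    print(f"{event:13s} -> {classify(event)}")
```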

Do not dismiss “Big Data” as a gimmick.

The term “Big Data” itself may seem oversold through marketing, but the production of large data sets is very real, very fast, and very large, with new data sets being produced every day through public and private portals.

Big data is described as data that has variety (video, text, images, unstructured and structured), volume (terabyte scale and beyond), velocity (constant production of data streams), and veracity (the data needs to be cleaned and managed).

Information has become more fluid and available to more people, faster and more easily. Although no company should drive business decisions solely by what happens on Twitter or Facebook (or on the Dow), the power of “Big Data” as a tool can help in trend analysis, customer segmentation, and insight into short- and long-term business decisions.

With “Big Data”, companies will be able to:

  • Respond more quickly to the market by making faster decisions.
  • Make patterns more evident in order to change processes and products.
  • Better realize innovations in products and services and bring them to market faster.
  • Build and manage new and current data streams.
  • Create a data analytics ecosystem, making the analysis and aggregation of data a business process all employees can utilize.

For a “Big Data” strategy to be successful, companies must:

  • Create data lakes and systems where raw data can live prior to being transformed for business intelligence and reporting.
  • Remove data silos where data exists but is accessible to only a few internal stakeholders.
  • Create a data analytics ecosystem.
  • Create hybrid cloud solutions and begin moving applications to the cloud.

Know what association and segmentation analysis are and how to use them to learn about your customers.

With more data streams coming online every day, new analytical methods can be used to gain insight into what consumers need in products and services. Two popular analytical methods are association analysis and segmentation analysis. In my next blog, I will discuss how these methods give insight into customers, to better predict how they shop and which campaign ads are more likely to succeed with consumers; a small segmentation sketch follows below.
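
As a preview, here is a minimal segmentation sketch using k-means from scikit-learn; the two customer features and their distributions are made up for illustration, and real campaigns would use far richer attributes.

```python
# Minimal customer segmentation sketch with k-means.
# Features and cluster structure are synthetic, for illustration only.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
# columns: [visits per month, average basket in dollars]
customers = np.vstack([
    rng.normal([2, 30], [1, 10], size=(100, 2)),    # occasional shoppers
    rng.normal([12, 85], [3, 20], size=(100, 2)),   # frequent high spenders
])

segments = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(customers))
for s in range(2):
    print(f"segment {s}: n={np.sum(segments == s)}, "
          f"mean visits={customers[segments == s, 0].mean():.1f}, "
          f"mean basket=${customers[segments == s, 1].mean():.0f}")
```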

With the popularity of MapReduce and Hadoop, the business world is seeing an increase in “Big Data” analytics based on clickstream and social media data. Analyses of large data sets that would once have taken days can now be completed in minutes.
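
For intuition, here is a toy map-and-reduce over clickstream events in plain Python; Hadoop’s contribution is distributing exactly this shape of computation across many machines, which this sketch does not attempt.

```python
# Toy MapReduce-style aggregation of clickstream events.
from functools import reduce
from collections import Counter

clicks = ["/home", "/product/42", "/home", "/checkout", "/product/42", "/home"]

mapped = [(url, 1) for url in clicks]                     # map: emit (key, 1)
reduced = reduce(lambda acc, kv: acc + Counter({kv[0]: kv[1]}),
                 mapped, Counter())                       # reduce: sum by key
print(reduced.most_common())  # [('/home', 3), ('/product/42', 2), ('/checkout', 1)]
```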

Conclusion

As data has become more prominent within organizations, and the means of collecting it easier and more ubiquitous, new skills will be necessary in certain roles to take full advantage of this data and drive value. Corporate culture will need to shift toward a data culture, where value is placed on collecting, cleansing, aggregating, and analyzing data sources and data repositories. Business leaders must establish new models that take advantage of social media and big data assets.

Works Cited

  1. Pitt, Leyland; Berthon, Pierre; Robson, Karen. “Deciding When to Use Tablets for Business Applications.” MIS Quarterly Executive 10(3), September 2011.

IT Strategies and Data Analytics

In an extension to my first blog, I look at quantitative analysis of enterprise IT functions to demonstrate how to create IT business value. With so much data being collected from IT systems, IT managers can use this pervasive data to their advantage. Functions such as maintaining system health, securing systems, and properly sizing new systems all have an impact on IT budgets.

Data analytics promotes value in IT. Strategies using data analytics aim to create incremental value that can build on itself. One of the keys to strategic IT value is adopting a holistic approach to technology value, ignoring gimmicks, gadgets, and marketing, and instead looking at innovation as a combination of people, information, and technology. This balanced business strategy involves taking ownership of IT assets, and in order for businesses to understand the value of those assets, IT managers must communicate that value. Data analysis is part of that communication. Although data analytics can provide great insight into business technology, it will not always succeed; the mission of data analytics as an IT strategy is to experiment often and not fear failure.

IT strategy involves aligning overall business goals with technology investment. The first priority is for IT resources, people, and functions to be planned around the organization's overall goals. For such alignment to take place, IT managers need to communicate their strategy in business terms.

In many companies, funding for strategic initiatives is allocated in stages so their potential value can be reassessed between those stages.  When executives introduce a new business plan to increase market share by 15 percent with a new technology, IT managers must also meet those goals by assessing the quality of the IT infrastructure.

Executives also must have confidence that the IT assets that they purchase are sound.  There must be mutual trust, visible business support, and IT staff who are part of the business problem-solving team.   All of these factors are needed to properly determine the business value of IT.

One of the principles of business technology innovation is to aim for joint ownership of technology initiatives. The quality of the IT-business relationship is central to delivering quality IT solutions that scale and meet production requirements. Imagine a scenario where IT sized a system only for an initial 5,000-meter deployment, unaware that the utility would bring 1,000,000 new meters online, each reading electrical data every hour, within two years. This scenario would leave the utility upgrading all of its hardware only a year after full deployment.

Innovations have created new ways of automating analysis to give more visibility into IT infrastructure.  This data can be analyzed using trending and predictive analytics to determine how much growth is needed based on specific targets and parameters.

Ideally, business and IT strategies should complement and support each other. In order to improve the IT “Value Proposition”, IT projects must stop being considered the responsibility of IT alone. The definition of value must be clearly defined and presented by IT, but there must be a greater understanding that business executives have to take leadership in making technology investments that shape and align the business strategy. IT strategy must always be closely linked with sound business strategy.

Not only should IT and business be aligned, they must also complement each other strongly in order to build the type of relationship essential to achieve business goals.  It is a mistake to consider technology projects solely the responsibility of IT or to make IT solely accountable.  Business and IT must be accountable to each other when implementing and executing IT projects.

When creating an IT Strategy that can align to business objectives, five themes should be addressed.  These include:

  • business improvement
  • business enabling
  • business opportunities
  • opportunity leverage
  • infrastructure

Research has shown that companies that have a framework for making targeted investments in IT infrastructure will further their overall strategic development and direction. When companies fail to make IT infrastructure investment strategic, they struggle with how to justify or fund it. To justify IT expenditures, many companies have concentrated on determining the business value of specific IT project deliverables, because projects that focus on specific business goals can be properly scoped to include IT expenditures.

How a company measures business performance can be an accumulation of metrics on both the business side and the IT side. Undelivered IT investment remains a big problem for organizations; many CEOs and CIOs believe that their return on investment (ROI) expectations for IT investments have not been met. Although IT measures can be qualitative, meaning that the expertise and knowledge of IT managers and staff contribute to understanding current and future IT growth and capacity, there are also ways to measure value quantitatively to help in decision making.

Non-technical communication is critical when addressing executives. IT staff typically work across many organizational units and must be effective at translating technical requirements into business requirements and vice versa. Communication has become mission critical in the IT business value proposition. When deciding how to apply data analytics across the organization, IT should work with business leaders to identify the IT functional areas that produce the most data. These areas include:

  • business analysis
  • system analysis
  • data management
  • project management
  • architecture
  • application development
  • quality assurance and testing
  • infrastructure
  • application and system support
  • data center operations

IT strategies require full business integration.  When IT managers are proposing new strategies, an executive summary should be the most important part of the proposal, prototype, roadmap, technical architecture document, etc.

Along with IT system metrics, IT managers must also keep in mind business operational metrics, which are based more on labor and time. IT managers need to factor both IT and operational metrics into reports to business stakeholders. There are several ways of reporting IT strategies to the business. Key Performance Indicators (KPIs) are fundamental to business decisions and are used to correlate business performance, such as how often a transaction results in customer satisfaction. Examples of KPIs include:

  • Efficiency rates
  • Customer satisfaction scores
  • Capacity rates
  • Incident reporting rates
  • Total penalties paid per incident

Balanced scorecards are strategic initiatives that align business strategy to corporate vision and goals. It is typically not the responsibility of IT managers to build scorecards, but rather to understand the corporate balanced scorecards when building IT strategies.

Dashboards are visual representations of the success, risk, status, and failure of business operations. In a fast-paced organization, they allow information to be quickly disseminated and assessed by stakeholders for business decision making. Dashboards tend to carry more quantitative analysis than other reporting styles.

IT Governance

In the area of governance, the International Organization for Standardization's ISO/IEC 27002 standard addresses monitoring and information security incident management. Many of the methods used to collect data about system health can complement adherence to information system security. Monitors log user access and security events such as unauthorized access to information systems. Keeping security audit logs synchronized with specific system activity logs can reveal coordinated attacks on the system or the denial-of-service (DoS) attacks that commonly target web applications and application service providers. Data analytics can help determine whether deviations in system performance are related to security events, such as unauthorized access or threats like malware, or whether there is a functional issue within the system itself. The boundaries between security and system health are consistently crossed in networking, services, and databases, where the integrity and volume of user traffic can be impacted. Any unauthorized access can affect the availability and integrity of an information system.
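
As a hedged sketch of that idea, the snippet below joins a hypothetical security audit log with per-minute request volumes and flags windows where both spike; the log formats and thresholds are assumptions for illustration, not any particular product's schema.

```python
# Sketch: correlating a security audit log with a system activity log
# to flag possible DoS windows. Formats and thresholds are hypothetical.
from collections import Counter

auth_failures = ["2019-03-01T10:02", "2019-03-01T10:02", "2019-03-01T10:03"]
requests_per_min = {"2019-03-01T10:01": 220, "2019-03-01T10:02": 9800,
                    "2019-03-01T10:03": 10400, "2019-03-01T10:04": 240}

fail_counts = Counter(auth_failures)
BASELINE_RPM, FAILURE_THRESHOLD = 500, 2      # assumed alerting thresholds

for minute, rpm in requests_per_min.items():
    if rpm > BASELINE_RPM and fail_counts[minute] >= FAILURE_THRESHOLD:
        print(f"{minute}: possible coordinated attack "
              f"({rpm} req/min, {fail_counts[minute]} auth failures)")
```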

DevOps and Agile Software Development

DevOps is a corporate culture that emphasizes collaboration between developers (typically software developers) and operational business units. DevOps provides tools and automation that can create a better customer experience by addressing issues and product changes faster. Information systems can assist this functional area by providing analytical techniques for assessing the readiness of release code in the software development life cycle.

The principles of DevOps are to develop and test against production-like systems, deploy with reliable processes, monitor and validate operational quality, and improve the customer feedback loop to turn issues around faster. Part of the power of data analysis is its ability to assist in the agile, continuous delivery of software. Automated testing and feedback combined with data analytical methods can provide highly useful information for the business. Presenting data analysis of performance, error logging, and customer feedback as dashboards and visualizations can make the software development life cycle visible to all business stakeholders. As a rule of thumb, business leaders are not interested in code or complex spreadsheets; they are much more interested in quality scores, key performance indicators (KPIs), and business metrics.
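
For example, a release dashboard metric might be computed along these lines; the function name, fields, and thresholds are illustrative assumptions, not a standard DevOps API.

```python
# Minimal sketch: turning raw test and error-log counts into the kind
# of KPI a release dashboard might show. Thresholds are illustrative.
def release_readiness(tests_passed: int, tests_total: int,
                      errors_logged: int, requests_served: int) -> dict:
    pass_rate = tests_passed / tests_total
    error_rate = errors_logged / requests_served
    return {
        "test_pass_rate": f"{pass_rate:.1%}",
        "error_rate": f"{error_rate:.3%}",
        "ready_to_ship": pass_rate >= 0.95 and error_rate < 0.001,
    }

print(release_readiness(tests_passed=982, tests_total=1000,
                        errors_logged=12, requests_served=250_000))
```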

IT Budgets

IT budgets are addressed in two categories: operational costs and strategic investments. Operational costs are “keep the lights on” costs that involve running IT like a utility; they include maintenance, computing, storage, network, and support, to name a few examples. Strategic investment is a balance of initiative spending and coordination with organizational strategic objectives, and it becomes more efficient as it moves from the corporate to the department level.

IT budgets are also about reducing costs. Many organizations have legacy systems that are not used efficiently and that impose requirements which create problems for strategic investments in new innovations. An application portfolio is a good way of understanding the risks versus benefits of maintaining legacy systems. Creating a data integration strategy as part of a data analytics ecosystem allows businesses to fully utilize all of their assets. Many of these systems contain metadata that has long since been de-supported. Part of the power of data analysis services such as online analytical processing (OLAP), business intelligence (BI), and master data management (MDM) is their ability to integrate with legacy systems.

Budgets are a key component of corporate performance management. The most important thing to understand about IT budgets is that they assist in the establishment of strategic goals. Systems provide data about the various levels of resource utilization. An example of a question that a business client might pose to an IT manager:

What are the annual storage requirements of our Enterprise Billing System?

This question could be answered by tracking the amount of storage consumed throughout the year, based on the number of data sets stored (in megabytes) and the interval of time those data sets are retained. From there, an IT manager can translate that requirement into yearly terms, which in turn gives the budgeting team a metric for how much storage to purchase or maintain each year.
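
A worked version of that calculation, with made-up numbers, might look like this:

```python
# Worked example of the storage question above, using assumed figures:
# daily data set sizes (MB) and a retention interval drive the yearly need.
DAILY_DATASET_MB = 1_200        # assumed average data landed per day
RETENTION_DAYS = 365            # assumed one-year retention
OVERHEAD = 1.25                 # assumed indexes, backups, growth headroom

yearly_mb = DAILY_DATASET_MB * RETENTION_DAYS * OVERHEAD
print(f"annual storage to budget: {yearly_mb / 1_000:.1f} GB")  # ~547.5 GB
```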

For large corporate firms in utilities, energy, and manufacturing, where there could literally be hundreds of servers, there needs to be a more centralized structure for IT operations budgets. The mandate given to IT managers in centralized IT budget structures is to standardize and streamline multiple processes for hardware and software services. The introduction of private and public cloud architectures, as well as virtual architectures, has made this possible. Another question likely to be posed to IT managers:

Can our physical servers be migrated to a cloud or virtual infrastructure with higher performance and availability?

Having the right analysis of current systems helps ensure that dollars are spent appropriately when systems are consolidated or provisioned, and that those systems perform according to business requirements. IT managers are under pressure from executives to do more with less. Data analysis has been a catalyst for innovation in cross-delivery business development through the integration of systems and data. Operational questions regarding IT include:

How much operational labor is expended providing IT services to an organization?

How much of the IT budget is expended implementing changes to infrastructure?

Other budget concerns include transitioning from a physical architecture to a cloud-service-based model. Typically, with public cloud architecture, resources are provisioned and managed by a hosting team. Most cloud services offer “elastic” solutions, such as Amazon’s EC2 or Microsoft Azure, which allow companies to use only what they need, so traditional sizing methodologies may not be as appropriate in such architectures. However, in very data-intensive industries with large-scale architectures and many interacting business and server processes, placing everything in a cloud domain is not only impractical but very expensive, and potentially illegal. For example, in the utilities industry, state regulations may prohibit customer data from being stored off site. An energy company’s proprietary information stored in an international data center that does not recognize the source country’s regulatory body could represent a public trust violation.

If migrating from a multi-tier architecture to complete cloud-based services, it’s important to understand the types of cost involved. Cloud-based services typically have a subscription model, where all the management, configuration, and provisioning (unless self-provisioned) is handled by the hosting company. A contract specifies a level of service and support, and the cost reflects how many resources the company is utilizing and the level of service with which it serves its customers. Payment terms can be yearly or quarterly, and there is usually a renewal date when payment is due [20].

The IT Value Proposition

IT value measures the worth and effectiveness of business technology solutions. It is mostly a subjective assessment of how a business measures its assets as they pertain to business goals. Value in information technology is typically defined in terms of return on investment (ROI), key performance indicators (KPIs), and other economic measures. IT is most valuable when tied to business goals and objectives. Adding value to IT also includes ensuring that IT assets are part of a data analytics ecosystem, in which IT assets generate insight into how businesses produce, collect, store, and learn from data. Data analytics is an important part of the IT value proposition because of the tremendous trove of knowledge and insight that can be gained from it, and a data analytics ecosystem helps create processes to turn data into actionable business decisions.

Other best practices in IT value include:

  • Evaluating the corporate business model in order to promote innovation.
  • Having strategic themes around data collection, dissemination, and analysis.
  • Getting the right people involved, including data scientists, engineers, business analysts, and many others.