Category Archives: Big Data

Chancellor Merkel proposes tax on big data

German Chancellor Angela Merkel, speaking at this week’s Global Solutions policy forum in Berlin, proposed a big data tax that would treat data as a raw material used in production.

The proposed reform is intended to reduce tax disparities between digital and traditional companies.

“The pricing of data, particularly the data of consumers, is the central issue that we need to solve in order to ensure a fair and equitable world of the future,” Merkel said.

With this week’s implementation of the General Data Protection Regulation (GDPR) throughout the EU, the intersection of politics, business, and privacy is receiving global attention from the governments of other countries, ordinary citizens and privacy advocates, and the international corporations that may be affected by changes in rules and regulations.

GDPR is intended to ensure that individuals are aware of what data is being collected, and how personal data is stored, used, and transferred. While sometimes criticised as putting too great a burden on companies, it is now being watched closely by the international privacy community.

In fact, Microsoft has expanded the GDPR rights of its EU customers to all customers worldwide, according to a statement issued earlier this week.

Merkel’s plan

Merkel’s big data tax proposition is based on the idea that intangible data has a tangible value as a raw material, used in conducting business in the modern environment. She called upon researchers from several think tanks and policy groups attending the forum to create concrete proposals to determine the value of data, and tax it as a material used in production.

The intention behind a data tax is to close the gap between traditional business models and digital business models, creating a fair and equitable standard grounded in an understanding of the true value of big data in the current business world.

Taxation of digital enterprises has been a topic of much discussion throughout Europe, with companies such as Apple, Facebook, Google, and Amazon facing charges of tax avoidance for funnelling sales through countries with attractive, lower tax rates. Billions of euros in back taxes are currently being repaid to countries by tech giants that used multinational structures to avoid paying higher tax rates in the countries where taxes were due.

In January, French President Emmanuel Macron called for countries throughout the EU to cooperate in creating a comprehensive big data policy, to better compete with American and Chinese markets. Macron, however, did not address tax reform for digital businesses at that time, focusing more on levelling the playing field between European businesses and international competitors.

EU Digital Tax Plan

In March 2018, the EU unveiled a digital tax plan to address tax avoidance as well as tax inequity: EU officials estimate that digital businesses pay an average effective tax rate of 9.5%, compared with the 23.3% traditional businesses pay on average.

The EU Digital Tax Plan proposes a 3% tax be levied on the sale of user data. The tax is expected to generate €5 billion per year for EU member states, and help to balance the tax disparity between digital and traditional businesses.
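As a rough illustration of how such a levy would operate, consider a minimal sketch in Python (the revenue figure below is hypothetical; the plan’s final mechanics were still under negotiation):

```python
# Minimal sketch of the EU plan's proposed 3% levy on user-data sales.
# The revenue figure used in the example is hypothetical.
DIGITAL_TAX_RATE = 0.03

def digital_levy(data_sale_revenue_eur: float) -> float:
    """Return the tax owed on revenue from the sale of user data."""
    return data_sale_revenue_eur * DIGITAL_TAX_RATE

# A firm booking 200 million euros in user-data sales would owe 6 million.
print(f"levy: {digital_levy(200e6):,.0f} EUR")  # levy: 6,000,000 EUR
```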

The idea that data has value in itself is becoming more accepted among individuals as well as organizations and enterprises. Facebook is currently facing intense public scrutiny for data gathering practices, as people become more concerned with what information is gathered and kept, and what is done with personal information once it is in a corporation’s grasp.

The value of data is also expressed in its uses, both legal and illegal. Data is stolen by hackers and resold on the dark web; data is held for ransom by malicious entities, and people pay to retrieve it; and data is legally gathered, bought and sold to be used for targeted advertising, political analysis, healthcare (public and private), science and research, and law enforcement organizations.

While the EU Digital Tax Plan and Merkel’s proposed big data tax are both based on the understanding that data has value, there are marked differences between the two ideas. The EU Digital Tax Plan addresses only the sale of data, leaving it to corporations to assign value to the data being sold. Merkel’s plan to treat data as a tangible raw material instead faces the challenge of an overarching, comprehensive data valuation by government. Additionally, there is debate over whether data itself has actual value, or whether the value lies in the algorithms used in data analytics, or in the business applications and results garnered through data analytics.

Big data used to map stress responses in corn

Plant scientists at Iowa State University have completed a new study that describes the genetic pathways at work when corn plants respond to stress brought on by heat — a step that could lead to crops that are better capable of withstanding stress, according to an announcement.

The findings, published as a “large-scale biology” paper in the academic journal The Plant Cell, map the stress response detected by the endoplasmic reticulum, an organelle in cells of corn seedlings. The research was a multilevel study in which the scientists analyzed massive data sets to account for the expression of tens of thousands of plant genes, Iowa State said. The size of the study required a multi-institutional effort that included scientists at Iowa State, Michigan State University and the University of North Carolina-Wilmington.

A better understanding of how corn plants cope with stress can help plant breeders engineer crops that can better tolerate and continue to produce under stressful conditions, said Stephen Howell, a distinguished professor of genetics, development and cell biology at Iowa State and senior author of the study.

The endoplasmic reticulum plays a key role in this stress response, because it is the subcellular location where many proteins are folded. Proteins acquire their function based on the shape in which they’re folded, but stressful conditions such as high heat cause proteins to be misfolded, and misfolded proteins can be toxic to cells, Howell explained.

“Protein folding is a very delicate process that’s easily upset,” Howell said. “We want to understand the mechanisms of the stress response to find ways in which we can intervene to promote survival.”

The researchers applied a chemical to corn seedlings to mimic stressful environmental conditions and then tracked the activity of around 40,000 genes using several high-throughput technologies. This is one of the first studies on maize stress to be carried out at this level, said Renu Srivastava, an assistant scientist in the Iowa State Plant Sciences Institute and a co-author of the study.
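The announcement does not describe the analysis pipeline itself, but a first pass over expression data at this scale often resembles the following sketch (file and column names are hypothetical; real studies rely on dedicated differential-expression tools and statistical testing):

```python
import numpy as np
import pandas as pd

# Hypothetical expression matrix: one row per gene (~40,000), with read
# counts for control and chemically stressed seedlings in the columns.
expr = pd.read_csv("maize_expression.csv", index_col="gene_id")

control = expr[["ctrl_1", "ctrl_2", "ctrl_3"]].mean(axis=1)
stressed = expr[["stress_1", "stress_2", "stress_3"]].mean(axis=1)

# Log2 fold change; the pseudocount avoids division by zero.
log2_fc = np.log2((stressed + 1) / (control + 1))

# Flag genes whose expression at least quadruples or drops fourfold.
responsive = log2_fc[log2_fc.abs() >= 2].sort_values()
print(f"{len(responsive)} of {len(expr)} genes respond strongly to stress")
```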

The scientists exposed the plants to persistent stress and found they could adapt, at least for a time. Eventually, however, the cells “give up,” which quickly leads to cell death, Srivastava said.

Mapping that transition from cell survival to death could lead the way to methods of prolonging or strengthening stress adaptation, she said.

The research parallels similar work in human health, Howell said. Protein misfolding also occurs in humans and can result in neurological diseases such as Parkinson’s and Alzheimer’s. Howell said studying protein misfolding in plants may illuminate how other organisms respond under similar circumstances.

More information: Renu Srivastava et al., “Response to Persistent ER Stress in Plants: A Multiphasic Process That Transitions Cells from Prosurvival Activities to Cell Death,” The Plant Cell (2018). DOI: 10.1105/tpc.18.00153

The top five tweets: From financing to big data — what did EPM readers enjoy reading this week?

Here we count down the top five tweets from @EPM_magazine that have gained the most impressions this week, taken from Twitter analytics.

5. Guiding you through how best to prepare for the impact of big data in healthcare, the latest article from Manuel Duval (Scientist.com) makes it onto our list at number five.

4. In fourth place is the recent article from SEA Vision on aggregation and how, in combination with serialisation, it can be most beneficial to pharma companies.

3. Disrupt or be disrupted? Our latest article from DHL Supply Chain makes it onto our Top Five at number three.

2. This week’s runner up position goes to the announcement that the FDA has accepted the supplemental new drug application for Novartis’ Promacta.

1. Number one this week goes to the news that UK biotech Orbit Discovery closed its series A financing round, raising £6.9 million to expand its platform.

Big Data Toronto Brings Canada to the Centre Stage in Big Data and AI

TORONTO, June 1, 2018 /CNW/ – The Big Data and AI Toronto Conference and Expo is back for its 3rd edition, expanding in both size and scope with a co-located conference. While Big Data Toronto focuses on the skills, software, and leadership needed to implement data insights, AI Toronto is dedicated to the city’s growing AI and deep learning communities. This unique 2-in-1 learning experience will take place on June 12 and 13, 2018 at the Metro Toronto Convention Centre.

This year’s conference will host more than 4,500 participants, 100 speakers, and 60 exhibiting brands. Speakers include innovators from Reddit, Uber, Mercedes-Benz, Sidewalk Labs, Shopify, Twitter, Modiface, the NHL, Yelp, theScore, and Ecobee, as well as Toronto-grown game-changers such as Coinsquare and Layer6. “We created a program that not only provides inspiring and actionable content around Big Data and AI, but one that also reflects the amazing talent and the ground-breaking achievements of our city,” says Dina Al-Wer, Program Manager. “We are giving attendees the environment to network with and learn from the best and the brightest in the field.”

According to a McKinsey & Co. study, firms that take advantage of big data analytics could increase their operating margins by more than 60%. Yet, as the MIT Technology Review states, only about 0.5% of digital data is analyzed and used. This is where the Big Data Toronto Conference comes in, bringing together Canada’s top players in Big Data and AI for two full days of education, networking, and product demos to help attendees thrive in the digital age.

The event also addresses the dark side of these new technologies and will guide attendees on how to navigate this complex field with an ethical, secure, and inclusive approach. More than 10 sessions at the event address the integration of AI in the workforce, regulations, privacy, cybersecurity, and the ethics of data and AI.

This year features two exciting new components: the Business Meetings Platform and the AI Startup Battle. The Business Meetings Platform allows attendees to connect with the hundreds of professionals, organizations, and companies present through a matchmaking algorithm. As well, AI Toronto will host the first-ever AI Startup Battle, featuring the top five AI-driven startups chosen by a panel of expert judges and mentors. “Canada is at the heart of the AI revolution thanks to its breakthroughs in deep learning. We are already seeing remarkable home-grown startups and we wanted to give them a platform to present their solutions,” says Olivia Kitevksi, AI Program Manager.

Expo passes are available for free and grant attendees access to more than 50 sessions and 60 exhibiting brands, including industry leaders such as IBM, SAS, AWS, Microsoft, MariaDB, Vertica, and Informatica. Due to limited capacity, organizers encourage you to book your passes as soon as possible.

To learn more about the Big Data and AI Toronto program and registration, you can visit: https://www.bigdata-toronto.com/

SOURCE Big Data Toronto

For further information: Media Contact: Kirk Jennings, Marketing Manager, Corp Agency, kjennings@corp-agency.com


Is Congress Using Data Analytics to Woo Voters for 2019 Lok Sabha Polls?

“We will only deal with public data and on rare occasions, private data but with consent,” says Praveen Chakravarty, Chairperson – Data Analytics Department, INC

June 1, 2018

Big Data is the buzzword for 2018. From the Facebook-Cambridge Analytica fiasco to the European Union implementing its General Data Protection Regulation, also known as GDPR, big data has managed to show its good, bad and ugly colours to the world.

And India is not far behind in the game, as the ball has moved beyond the startup and corporate courtyard. We now see one of the oldest political parties in the country, the Indian National Congress (INC), setting up a data analytics department to reach out to its voters in a more effective and scientific manner.

The Need

Recalling the days after Congress President Rahul Gandhi formed the party’s data analytics department, Praveen Chakravarty, Chairperson of the INC’s Data Analytics Department, says members would often walk up to him and ask whether other political parties had any such department. His response would be straightforward: “To my knowledge, I think we are the only ones in India, and maybe the only one around the world, to have a formal department as such within the party.”

For Chakravarty, Indian politics is perhaps the largest producer and consumer of data in India.

Take India’s geographical context: the country is divided into 7 lakh villages, 600 districts, and 36 states and union territories, but the most granular unit of information you can get comes from the polling booth. “There were nearly a million polling booths in the country in the 2014 elections, where you get information on how 600 million voters voted. You can actually get data on how each booth has voted. For every single election since 1952, imagine the size and the quantum of data. This is really big data and it reveals preferences of people,” he points out.
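Chakravarty does not describe the party’s tooling, but rolling booth-level returns up to constituency level is the kind of first step he is alluding to. A minimal sketch in Python, with entirely hypothetical file and column names:

```python
import pandas as pd

# Hypothetical booth-level results: one row per polling booth, with a
# constituency column and vote counts per party. Names are illustrative.
booths = pd.read_csv("booth_results_2014.csv")

# Roll roughly a million booth rows up to constituency level.
by_seat = booths.groupby("constituency")[["party_a_votes", "party_b_votes"]].sum()
by_seat["margin"] = (by_seat["party_a_votes"] - by_seat["party_b_votes"]).abs()

# The closest-fought seats are where booth-level targeting matters most.
print(by_seat.nsmallest(10, "margin"))
```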

Discussing the seriousness of the party at Data Science Congress 2018, the politician-cum-economist said, “Our approach to data is pure science. For us, data is important for making decisions; it is not something we think of only when an election is coming up and then forget about after the election.”

The Struggle

On a typical day, Chakravarty gets bombarded with all sorts of data, from images and videos to audio and numbers. This is then cleaned, processed, and converted into meaningful information that the party can use to make decisions, whether within the Congress or electorally.

“Data can be used to make decisions like whom we should give tickets to, whom we should ally with, what our stand should be, and how to convince our voters. That’s why I have always felt the use of data is perhaps most acute in politics, especially in the Indian context, which is so complex and diverse that it can be a game changer,” the former investment banker added.

However, the challenges facing the Congress are in line with those of any startup working in data analytics. Even though India is heaven for a data scientist, most of the data in the country is unstructured and hence not readily usable, and the party continues to struggle to process audio-based data.

Déjà vu?

The Facebook-Cambridge Analytica ruckus was a clear example of how data can be misused by stakeholders to manipulate voters. With India’s Lok Sabha election due in 2019, and with the country still struggling to outline laws for data security and data privacy, each one of us would surely raise an eyebrow at the Congress’s move.

Rubbishing any suggestion of a potential data breach, Chakravarty told Entrepreneur India, “We are very clear here that we will only deal with public data and, on rare occasions, private data with consent. There has been no breach in the Congress party previously, nor will there be in the future. We are very strict about these issues.”

This is why, Chakravarty says, the Congress has not appointed an external vendor to process the data and is instead working with professionals and a group of party volunteers.

After its massive fall in the 2014 Lok Sabha election, the party is hoping that big data will help it sweep the next Lok Sabha poll. But only time will tell whether India will witness a Cambridge Analytica 2.0 situation or the dawn of Congress 2.0.

Big data and local government

The best big data projects start small

Much of the hype around big data carries an image of large companies using esoteric hardware and software to run incredibly complex analytics. But the reality is that many big data projects are actually quite small, use straightforward technology and achieve real world aims.

Today, thanks to the cloud, it is more than possible to run smaller scale projects with minimal up-front investment. Local authorities are already showing interesting results in areas from early intervention and social care to traffic management and recycling and rubbish disposal.

Get the fundamentals right first

It is crucial to get the basic infrastructure in place before embarking on a big data project.

It is about getting the fundamentals right, not building a massive data centre and hiring lots of people in white coats to run it.

The key barrier for most organisations is improving on existing infrastructure to allow better access to the data that is already being collected. Local authorities are no different to most organisations in finding that data is often stuck in siloes within departments, or even individual business units, and is almost impossible to access and use. Breaking open these siloes requires cultural as well as technological change. But cloud technology and software-based networking can make this process far easier than it was in the past.

Investing in open source, cloud-based infrastructure can help create systems where data can be accessed by applications across the organisation or by partners outside, as long as there is cultural support too.

Once data is available, it needs to be checked to ensure it is clean and fit for purpose. This is another vital step which can often be overlooked. Most big data projects need to spend a sizeable fraction of their budgets on data processing and cleansing before any analytics is even attempted. This process allows you to get the very best out of the data you own; only then do you need to consider whether to import data from elsewhere.
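What that cleansing step looks like varies with the data, but a minimal first pass, sketched here in Python with hypothetical file and column names, is usually some combination of deduplication, normalisation, and dropping unusable records:

```python
import pandas as pd

# Hypothetical service-request extract pulled from a departmental silo.
df = pd.read_csv("service_requests.csv")

# Typical first-pass cleansing: drop exact duplicates, normalise text
# fields, parse dates, and discard rows missing key identifiers.
df = df.drop_duplicates()
df["postcode"] = df["postcode"].str.upper().str.strip()
df["opened"] = pd.to_datetime(df["opened"], errors="coerce")
df = df.dropna(subset=["case_id", "opened"])

print(f"{len(df)} clean records ready for analysis")
```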

Once you have the infrastructure and clean data sets in place, the real work of analysis can begin. Again, cloud technology can help here, with plenty of cloud apps available for early-stage analysis.

Although the implications of moving to a more data-driven organisation can be profound, some of the most successful examples start with quite modest goals.

Identifying a single issue and using data to solve it can help prove the usefulness of big data strategies to the whole organisation and win over doubters. Data science is a complex and specialised business and most organisations find it easier to work with a partner with relevant skills.

It is also crucial to ensure the right data governance and privacy controls are in place from the very start of any project. Again, ensuring your fundamental infrastructure is sound and solid will make compliance with GDPR and other regulations much easier.

Start simple

Northamptonshire County Council took such a step-by-step approach in order to reduce congestion and make spending on transport more effective. The council hired the University of Northampton to collate data on journeys made by students, health care workers, patients and council staff. In total this provided information on journeys made by 32,000 people, a sizeable proportion of total travel in the county.

Quite simple analysis found that almost half of dedicated patient transport to hospital could be provided in other ways. Analysis of other council transport contracts found spare capacity of almost 1,000 places which could provide more cost-effective travel for some of these patients.

Over and above this, the project, oneTRANSPORT, is also helping to inform future decision making around expanding the university to ensure that transport and congestion implications are properly considered and planned for. The council is now partnering with other local authorities and technology providers to deepen the understanding available.

Building on the success of such projects allows local government to take a more innovative approach to how services are delivered. There are big opportunities not just for savings but also for radical improvements in services in areas like social care and early intervention policies.

Next steps

Innovation charity Nesta’s discussion paper “Datavores of Local Government” looked at some emerging trends in how local councils are using their data to provide savings and improved services. One important shift is a move to predictive analytics. Moving spending from solving problems when they occur to preventing them earlier has long been a goal for local authorities. Historically this has been stymied because the financial burden can often shift to another organisation, and it has been difficult to accurately measure the success of such early intervention.

Predictive analysis can help remove these barriers.

In the US, a nationwide NGO is using predictive analysis to identify which expectant mothers would most benefit from visits from specialist nurses during pregnancy and the first two years of the child’s life. By targeting resources and testing the best time for intervening, the charity is able to fine-tune how its resources are spent and improve outcomes for both mother and child. This sort of predictive analytics can offer radical solutions by allowing authorities to intervene before rather than after problems arise.
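The kind of model behind such targeting can be surprisingly simple. A minimal sketch in Python follows; the feature set and file names are entirely hypothetical, and a real deployment would need far more care around bias, consent, and validation:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical historical cases with known outcomes.
cases = pd.read_csv("historic_cases.csv")
features = cases[["prior_contacts", "housing_flags", "age", "deprivation_idx"]]
outcome = cases["needed_intervention"]

X_train, X_test, y_train, y_test = train_test_split(
    features, outcome, test_size=0.2, random_state=0
)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")

# Rank open cases by predicted risk so scarce visits go where they help most.
open_cases = pd.read_csv("open_cases.csv")
open_cases["risk"] = model.predict_proba(open_cases[features.columns])[:, 1]
print(open_cases.sort_values("risk", ascending=False).head(10))
```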

Big data is not going away

There is no way for local authorities to ignore the ever increasing wave of big data.

As more services shift online, more communication with residents is digital, more council staff use mobile devices and more infrastructure is linked via the Internet of Things – so the scale of the data lake will continue to grow.

Getting the right infrastructure in place means that rather than just coping with this increase in data, councils can take real advantage of it and find genuine value in the insights it can provide.

In the next few years how local councils deliver services, and even the types of services they offer, will be profoundly changed by the impact of data analysis and also machine learning.

Embracing this will enable services to continually evolve and improve as citizens’ needs change.

Taking Predictive Maintenance from the IIoT to Big Data Analytics

Predictive maintenance can grow far beyond traditional condition monitoring when the data from equipment is gathered through the Industrial Internet of Things (IIoT) and then stored and processed through Big Data analysis systems, such as IBM’s Watson. “When you gather the regular maintenance data, you can build a history of the data. Then you use algorithms to detect anomalous behavior in the historical data. In time, you learn that when you see this anomaly, you know—based on the history—that this component is likely to fail in the next 10 to 17 days,” Tom Craven, VP of product strategy at RRAMAC Connected Systems, told Design News. “The analysis of the data can predict the very specific failures in a specific timeframe. That’s where IBM Watson comes in.”
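Craven does not spell out the algorithms, but a common baseline for the “detect anomalous behavior in the historical data” step is a rolling statistic over the sensor stream. A minimal sketch in Python, using hypothetical vibration data rather than anything from RRAMAC or Watson:

```python
import pandas as pd

# Hypothetical sensor history: timestamped vibration readings.
readings = pd.read_csv("vibration_log.csv", parse_dates=["ts"]).set_index("ts")

# Rolling z-score: how far each reading sits from recent behaviour.
window = readings["vibration"].rolling("30D")
z = (readings["vibration"] - window.mean()) / window.std()

# Readings more than three standard deviations out get flagged; correlating
# such flags with past failures is what lets a system estimate a failure
# window (e.g. "likely to fail in the next 10 to 17 days").
anomalies = readings[z.abs() > 3]
print(anomalies.tail())
```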

Sensors can gather equipment condition data and send it via the IIoT to a service company that can use it for predictive maintenance. (Image source: RRAMAC)

Craven will present a session at the Atlantic Design and Manufacturing Show in New York City on June 14 with Kayed Almasarweh, the Watson and cognitive IoT solutions lead at IBM. The program, Leveraging IoT for Predictive Maintenance, will look at the combination of condition monitoring data collection and the analysis of that data via Big Data processing in IBM’s Watson.

Grabbing the Quick ROI

Before customers make the major jump into Big Data processing, they can enjoy an early return on investment (ROI) from predictive maintenance basics—the stream of equipment data that comes from sensors and is delivered to a condition monitoring system via the IIoT. ROI can be achieved more quickly if the company doesn’t have to set up the servers and configure the software. Service companies like RRAMAC can grab the sensor data over the internet and process it on remote servers. “When you’re not installing a bunch of software and spending time learning how to configure it in-house, it shortens the timeline to the ROI,” said Craven. “The initial investment is less when you don’t have to invest in all the development hours to get it going.”

The reduced investment allows companies that couldn’t otherwise afford to develop a condition monitoring system to reap the benefits of predictive maintenance. “The IIoT brings predictive maintenance to a whole new set of customers where it wouldn’t have made sense before,” said Craven. “A lot of companies can benefit from predictive maintenance even if it doesn’t make sense for them to do it on their own.”

Giving the OEMs Their Own Equipment Data

Not all predictive maintenance data needs to go directly to the end user. In some cases, the end customer is using a piece of equipment that’s not connected to a factory line. Examples can include a recycling machine or a rock crusher. The user doesn’t have the network to gather equipment data, so the equipment OEM can track the machine data and monitor the equipment’s health. “Sometimes, our customer is the OEM. The OEM gets the information. The customer may get the information as well, including the alerts,” said Craven. “We provide data to the OEM and if the OEM chooses, the OEM can provide the data to the customer.”

OEMs often sell extended warranties. But the OEM can only sell the extended warranty if the health of the machine can be monitored. “If you have a machine that requires maintenance, those machines can wear out quickly if they’re not maintained,” said Craven. “If the OEM is monitoring the equipment regularly and making sure the customer is doing regular maintenance, the OEM can extend the warranty knowing that it’s enforced. It’s just like Ford not supporting the warranty on a car that hasn’t had regular oil changes.”

Rob Spiegel has covered automation and control for 17 years, 15 of them for Design News. Other topics he has covered include supply chain technology, alternative energy, and cyber security. For 10 years, he was owner and publisher of the food magazine Chile Pepper.
