But the pressure and the challenge to drive business impact are daunting in this climate. How do you stimulate growth while making large investments in future technologies without dramatically changing your business model? Companies are watching their operational costs balloon as they dip their toes into numerous areas of investment that require significant and often disparate expertise. Meanwhile, small start-ups with incredible focus and no prior obligations can leverage new technologies in ways that established competitors struggle to answer.
So how do you protect yourself from disruption? How do you innovate without radically increasing the cost of doing business? It all boils down to one simple question: do you feel secure in the tools you’re using? That question applies whether it’s your personal finances, your career, or the engineering systems of the future. The IIoT, for instance, ushers in a new era of both networked potential and significant risk.
Most test and measurement vendors have been slow to respond to the inevitable rise of software and are just now hitting the market with software environments that help the engineering community. But even those can only get you so far. As the industry continues to evolve, the tools engineers use to design these connected systems must meet four key challenges: productivity through abstraction, software interoperability, comprehensive data analytics, and the efficient management of distributed systems.
Productivity through abstraction
Simply put, abstraction is making the complex common. In the world of designing engineering systems, complexity often comes from programming. The custom logic that adds the ‘smart’ to smart systems typically requires a level of coding that is often prohibitively complex. The complex must become common. To solve this challenge, engineers need a ‘programming optional’ workflow that enables them to discover and configure measurement hardware, acquire real-world data, and then perform data analytics to turn that raw data into real insight. NI is introducing a new configuration-based workflow in the form of LabVIEW NXG. It is complemented by the graphical dataflow programming paradigm native to LabVIEW, known for accelerating developer productivity in complex system design for nearly 30 years. With this configuration-based interaction style, you can progress from sensor connection all the way to the resulting action without writing a line of code, while the corresponding code modules are still constructed behind the scenes. That last step is a critical feature that streamlines the transition from one-off insights to repeatable, automated measurements.
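LabVIEW and LabVIEW NXG are graphical environments, so the code they construct is not text, but the shape of the discover-acquire-analyse workflow can be sketched in text-based code. The snippet below uses NI’s nidaqmx Python API; the device name ‘Dev1’ and the one-second, 1kHz capture are assumptions for illustration, not output generated by LabVIEW NXG.

    # A minimal sketch of the acquire-then-analyse workflow using the
    # nidaqmx Python API. The device name "Dev1" and the 1 kHz, one-second
    # capture are assumptions for illustration.
    import statistics
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    with nidaqmx.Task() as task:
        # Configure: one voltage channel on an assumed device
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
        task.timing.cfg_samp_clk_timing(rate=1000,
                                        sample_mode=AcquisitionType.FINITE,
                                        samps_per_chan=1000)
        # Acquire one second of real-world data
        samples = task.read(number_of_samples_per_channel=1000)

    # Analyse: turn raw data into a simple insight
    print(f"Mean voltage: {statistics.fmean(samples):.3f} V")

Everything up to the read call is what the configuration-based workflow accomplishes through clicks rather than code.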
Software interoperability
With the growing complexity of today’s solutions, the need to combine multiple software languages, environments, and approaches is quickly becoming ubiquitous. However, the cost of integrating these software components is considerable and continues to increase. Languages for specialised hardware platforms must be integrated with other languages as these compute platforms are combined into single devices. The typical ‘solution’ is for the design team to assume the burden of integration, as illustrated below. But that merely treats the symptoms; the root cause is one that software vendors must fix.
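As a concrete, hypothetical example of that burden, here is the kind of foreign-function glue a design team writes by hand to call a C signal-processing routine from Python. The library libfilter.so and its rms function are invented for illustration; every type declaration is integration cost the team carries.

    # Hand-written glue between two languages: loading a hypothetical
    # C library into Python via ctypes.
    import ctypes

    lib = ctypes.CDLL("./libfilter.so")  # assumed C library

    # Declare the C signature: double rms(const double *data, size_t n)
    lib.rms.argtypes = [ctypes.POINTER(ctypes.c_double), ctypes.c_size_t]
    lib.rms.restype = ctypes.c_double

    def rms(samples):
        # Copy the Python list into a C array before crossing the boundary
        arr = (ctypes.c_double * len(samples))(*samples)
        return lib.rms(arr, len(samples))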
By design, NI’s software-centric platform places software interoperability at the forefront of the development process. Though LabVIEW has been at the centre of this software-centric approach, many complementary software products in NI’s portfolio are individually laser-focused on specific tasks, such as test sequencing, hardware-in-the-loop prototyping, server-based data analytics, circuit simulation for teaching engineers, and online asset monitoring. These products are purposefully limited to the common workflows of the engineers and technicians performing those tasks, a characteristic they share with other software in the industry tailored to the same purposes. For NI software, however, LabVIEW provides ultimate extensibility through an engineering-focused programming language that defies the limitations of tailored software. For example, consider DAQExpress.
Figure 2: The interoperability between products in NI’s software portfolio simplifies the sharing of IP and transfer of code for more complex development.
DAQExpress is new companion software for USB and low-cost plug-in NI data acquisition hardware that simplifies the discovery and configuration of hardware and provides access to live data in two clicks. All the configuration ‘tasks’ within this product are fully transferable to LabVIEW NXG, which simplifies the transition from hardware configuration to measurement automation.
In addition to interoperating within the NI platform, products like LabVIEW 2017 feature enhanced interoperability with existing IP (intellectual property) and standard communications protocols. For embedded systems that need to interoperate with industrial automation devices, LabVIEW 2017 includes native support for IEC 61131-3, OPC-UA, and the secure DDS messaging standard. It also offers interactive machine learning algorithms and native integration with Amazon Web Services.
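To ground what OPC-UA interoperability means in practice, the following sketch reads a single tag from an OPC-UA server using the third-party python-opcua package. The endpoint address and node ID are placeholders, and this illustrates the protocol itself rather than LabVIEW’s implementation of it.

    # Reading one value from an OPC-UA server with the third-party
    # python-opcua package. The endpoint and node ID are placeholders,
    # not references to a real installation.
    from opcua import Client

    client = Client("opc.tcp://192.168.1.10:4840")  # assumed endpoint
    client.connect()
    try:
        node = client.get_node("ns=2;s=Machine1.Temperature")  # assumed tag
        print("Temperature:", node.get_value())
    finally:
        client.disconnect()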
Comprehensive data analytics
Perhaps the most profound benefit of the mass connectedness among the world’s systems is the ability to instantaneously access and analyse every data point you collect. This process is critical to automating decision making and eliminating preventable delays in taking corrective action when data anomalies occur. To create the future network that can support this need, billions of dollars are being poured into research as algorithm experts from around the globe race to meet the demands of 1ms latency coupled with 10Gbit/s throughput. This direction introduces new demands on software. The first is to ensure that processing elements can be easily deployed across a wide variety of processing architectures and then redeployed on a different processor with minimal, and ideally zero, rework. The second is to be open enough to interface with data from any number of nodes and in any number of data formats.
NI has invested in server products that let you intelligently and easily standardise, analyse, and report on large amounts of data across your entire test organisation. A key component is providing algorithms that preprocess files and automatically standardise items such as metadata, units, and file types, in addition to performing basic analysis and data quality checks. Based on that data’s contents, the software can then intelligently choose which script gets run, as sketched below. This type of interface is critical to eliminating the complexity of real-time data analytics so you can focus on what matters: the data.
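A toy version of that content-based routing, with invented metadata fields and analysis routines rather than NI’s actual server logic, might look like the following in Python:

    # Inspect a data file's (hypothetical) metadata, standardise units,
    # then route the file to the matching analysis routine.
    import json

    def to_celsius(value, unit):
        # Standardise temperature units before analysis
        return (value - 32.0) * 5.0 / 9.0 if unit == "degF" else value

    def analyse_vibration(path):
        print("running vibration analysis on", path)

    def analyse_thermal(path):
        print("running thermal analysis on", path)

    DISPATCH = {"vibration": analyse_vibration, "thermal": analyse_thermal}

    def route(path):
        with open(path) as f:
            meta = json.load(f)["metadata"]  # assumed file layout
        meta["ambient_c"] = to_celsius(meta["ambient"], meta["ambient_unit"])
        # Choose the analysis script based on the data's own description
        DISPATCH[meta["measurement_type"]](path)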
Distributed systems management
The mass deployment and connectedness of these systems have renewed the need to efficiently manage all the distributed hardware from a centralised, and often remote, location. Today, this typically requires replicating single-point deployments across hundreds, and even thousands, of systems. Centralising that management then makes it possible to view a real-time dashboard of the hardware from a remote depot instead of physically accessing each system.
Figure 3: SystemLink introduces a web-based interface to manage distributed hardware systems.
SystemLink from NI is software that helps you centralise the coordination of a system’s device configuration, software deployment, and data management. This reduces the administrative burden and logistical costs associated with systems management. The software also improves test and embedded system uptime by promoting awareness of operational state and health criteria. It simplifies managing distributed systems and provides APIs for LabVIEW and other software languages such as C++.
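As a purely illustrative sketch of what querying such a web-based management interface might look like from another language, here is a hypothetical HTTP client in Python. The server URL, route, and response fields are invented placeholders, not SystemLink’s documented API.

    # Hypothetical sketch of querying a distributed-systems manager over
    # HTTP. The URL, route, and response fields are illustrative only.
    import requests

    BASE = "https://systems-server.example.com"  # assumed server address

    resp = requests.get(f"{BASE}/api/v1/systems", timeout=10)
    resp.raise_for_status()
    for system in resp.json():
        print(system["hostname"], "health:", system["health"])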
Beyond the individual innovation within each of these product releases, the collection represents the culmination of the ongoing investment in software that NI has demonstrated year after year.
Author profile
Jeffrey Phillips is principal software product marketing manager at NI.