IoT on the Edge: Opportunities and Challenges in Distributed Computing
The Internet of Things (IoT) is getting bigger every day, and as it does devices with computing power are simultaneously getting physically smaller and spreading out. Computers are no longer just processors with screens and keyboards – they are phones, watches, home appliances, digital assistants, sensors, and even biometric implants. The proliferation of these technologies has opened up a world of opportunities as well as a veritable Pandora’s Box of unintended consequences. Edge computing leverages the power of distributed resources to solve many of the data processing challenges created by the growing number and significance of smart technologies.
The Proliferation of Devices into an “Internet of Things”
As time goes on, more people have greater access to a wider variety of computing technologies. There are more than five billion smartphones, two billion personal computers, and one billion tablet devices in use today – and these figures do not even account for the increasingly diverse array of internet-connected devices available for home and personal use. From the cars we drive to the appliances in our homes, the network of internet-connected objects around us is rapidly proliferating. These billions of devices make up the Internet of Things (IoT), and it’s shifting the future of computing.
The Internet of Things (IoT)
By definition, the IoT is “a system of interrelated computing devices, mechanical and digital machines, objects, animals, or people that are provided with unique identifiers and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction.” In the physical world, the IoT is represented by the increasing number of objects capable of collecting and exchanging data.
The IoT helps people and businesses live and work smarter by connecting all of our smart devices. It includes a vast array of technologies, including smart devices currently used in homes, healthcare, finance, retail and manufacturing, labor, transportation, agriculture, government and military, and technologies that measure waste and energy consumption.
Home and personal devices are getting smarter and more accessible, and there is a lot of excitement surrounding consumer technologies in this space. While new innovations in wearable, portable, and in-home devices often take the spotlight, such applications of IoT technology pale in comparison to the multitude of enterprise and industrial applications. In particular, 2019 saw major development in the Industrial Internet of Things (IIoT), which is made up of devices primarily designed for complex enterprise applications. By relying on smart devices and sensor technologies, the IIoT allows engineering and manufacturing organizations to gather data continuously while identifying positive and negative patterns. When used in concert with decentralized computing methods, this practice increases the efficiency and effectiveness of industrial activities while also identifying and solving problems more quickly.
Industry and Enterprise Embrace the IoT
The efficiency and functionality gains made possible through the integration of IoT into business operations have led to a massive increase in the adoption of this technology among enterprises. Between 2014 and 2019, the number of businesses that used IoT technologies nearly doubled. By 2019, the IIoT was in use or under consideration by an estimated 82% of U.S. industries. By 2023, analysts project that the number of IoT-connected devices worldwide will nearly triple.
This interest has been stimulated primarily by constant improvements in the quality and application of new virtualized productivity tools, such as digital twins. Digital twinning enables engineers to build a virtual version of a system before spending money to build the real thing. This allows professionals to discover design flaws earlier in the process, even more effectively than 3D printing, which also enables quick and efficient prototype production.
Digital twins are just one novel application of devices in the IoT ecosystem, but innovators find new ways to leverage virtualization technologies to interact with internet-connected objects in increasingly value-driven ways every day. As a particular system or process is installed, updated, operated, or maintained, IoT devices or human operators in the field collect data. This data can be sent to a specialized virtual asset that evaluates and analyzes the information to develop recommended actions to improve the system. If and when the recommended action is initiated, the cycle begins again. As this process continues, the systems across enterprises and industries are able to improve by leaps and bounds.
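The collect-analyze-recommend-act cycle described above can be sketched in a few lines of code. This is a minimal illustration, not any vendor's API: the device names, target value, and tolerance are all invented for the example.

```python
# A minimal sketch of the IoT feedback cycle: field devices report readings,
# a virtual asset analyzes them, and recommended actions feed back into the
# system. All names and thresholds here are illustrative.

def collect_readings(devices):
    """Gather the latest reading from each (hypothetical) field device."""
    return {name: sensor() for name, sensor in devices.items()}

def analyze(readings, target=70.0, tolerance=5.0):
    """Compare readings against a target and recommend adjustments."""
    return {name: target - value for name, value in readings.items()
            if abs(target - value) > tolerance}

def apply_actions(actions, log):
    """Act on each recommendation; here we just record the adjustment."""
    for name, adjustment in actions.items():
        log.append(f"adjust {name} by {adjustment:+.1f}")

# One pass through the cycle with two simulated sensors.
devices = {"pump-1": lambda: 62.0, "pump-2": lambda: 71.5}
log = []
readings = collect_readings(devices)
apply_actions(analyze(readings), log)
print(log)  # pump-1 is well below target; pump-2 is within tolerance
```

In a real deployment the cycle would run continuously, with the analysis step hosted on a virtual asset and the actions pushed back to the physical equipment.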
The data collection and transfer facilitated by IoT devices is opening new opportunities for enterprise across a diversity of markets, and it is a space in which forward thinkers are constantly breaking new ground. Customized IoT devices and systems are created every day to serve unique personal and enterprise needs. As such, IoT applications are expanding.
The possibilities in this space are stimulating some of the most creative minds in tech – but at the same time, they are raising novel issues. The massive network created by the Internet of Things is both challenging existing standards in internet privacy and cybersecurity and pushing the boundaries of existing data management and networking technologies.
Challenges Created by the Proliferation of Internet-Connected Devices
IoT technologies collect data locally, which is then transmitted to a centralized infrastructure for processing – often a cloud. On the cloud, data is processed and stored through a centralized infrastructure that manages network activity. This approach has provided a cost-effective and flexible solution for data processing for decades, but the rapid development of mobile devices and IoT applications is straining some cloud architectures. The most meaningful challenges created by the proliferation of IoT devices, particularly in enterprise and industrial applications, include concerns regarding interoperability, security issues, and data management challenges – all of which can be effectively addressed with distributed computing.
Lack of Interoperability
The universe of devices connected through the IoT engages with networks, clouds, and each other through a variety of protocols. The Internet Architecture Board (IAB) released guidance outlining four common communication models utilized by IoT devices: device-to-device, device-to-cloud, device-to-gateway, and back-end data sharing. Each of these communication models makes it possible for us to engage with the IoT in dynamic and interesting ways, but they also present potential interoperability challenges. For example, consider two common device-to-device communication protocols: Zigbee and Z-Wave. Devices connect to each other through one protocol or the other, and devices on the Zigbee protocol do not share the same security and trust mechanisms as those on the Z-Wave protocol. As a result, one family of devices is incompatible with the other.
The protocol-specific nature of device-to-device communications raises interoperability challenges that may also incentivize the rise of a single protocol (such as Bluetooth) by which all devices on the IoT connect, greatly limiting competition in a field not yet governed by effective standards. The same risk of vendor lock-in attaches to device-to-cloud communications, which is how common devices like the Nest Labs Learning Thermostat and Samsung's SmartTV function. These devices connect to their parent companies' clouds, which expands the functionality of the device beyond what the device itself supports. However, because the communication protocols between the device and the cloud are established and managed by the device's manufacturer, one device connected to one cloud cannot necessarily engage with another device connected to a different cloud.
Privacy and Security
When billions of devices are connected via live internet connection, security becomes a major concern. This is particularly true in the context of the internet-connected objects that make up the IoT, as manufacturers are quite literally left to their own devices. Some even choose to ship devices incapable of receiving the upgrades necessary to fix vulnerabilities discovered after release, which is exactly what occurred at Fiat Chrysler in 2015. That year, the auto manufacturer was forced to recall 1.4 million internet-connected cars that could easily be hacked through their wireless connection.
Hackers accessing devices through their wireless connection is one security risk associated with the proliferation of the IoT, but the reciprocal risk also applies: access to physical devices may provide a means by which cyberattacks can be achieved. Although developer communities like Arduino and Raspberry Pi are advocating for security standards across the IoT as a matter of industry best practice, individuals and businesses utilizing IoT devices remain at risk of having their private information accessed, stolen, or used nefariously.
Data Management and Bandwidth
The IoT generates a tremendous amount of data. This so-called "big data" is commonly siloed in isolation, particularly in the device-to-cloud model described previously. The countless bits of data moving across the IoT at any given moment place high demand on network bandwidth, and bottlenecks often arise in the systems and protocols designed to carry data from devices to centralized networks.
The IoT is expanding, and as it does it is straining network infrastructure. Non-PC devices accounted for an estimated 70% of all internet traffic this year, which is fantastic for improving access but challenging for network operators. This growing strain on bandwidth has been a major driver behind innovations designed to reduce the load IoT devices place on networks, such as the Long Term Evolution (LTE) cellular radio access technology known as the Narrowband Internet of Things (NB-IoT). NB-IoT enables long-range data transmission at relatively low cost, and it will likely support many of the billions of devices that join the IoT over the coming decades. However, with a channel bandwidth of just 200 kHz, NB-IoT is not an effective solution for latency-sensitive organizations.
As more IoT devices are connected to an enterprise network, end-user demand increases. Without distributed computing solutions, networks can suffer from capacity constraints, stringent latency requirements, resource-constrained devices, and intermittent connectivity. This poses a meaningful challenge to any enterprise that relies on IoT devices, and businesses are looking for ways to ensure their networks can manage the big data these objects will be transmitting.
New challenges may raise barriers, but they also present opportunities. Many technology professionals are turning to distributed computing methods to solve the security, efficiency, and latency problems networks are experiencing with the increase in communication demand. Particularly, edge computing approaches can be utilized to overcome the increasing constraints placed on cloud and similarly centralized network architecture by the ever-expanding IoT.
The Rise of Edge Computing
The proliferation of internet-connected mobile devices is forcing software developers to find new solutions for data storage and communication. These devices are constantly collecting and transmitting new data, which weighs heavily on network infrastructure. Edge computing architecture offers a more effective and efficient means of managing the massive amounts of data created by IoT devices.
Defining the Edge
Naturally, most of us think of “the edge” as a place. However, the edge is not a “where,” but rather a “what.” Specifically, the edge is a conceptual solution to the functional problems created by the proliferation and relevance of increasingly diverse types of digital data.
Edge computing is an emerging field in which developers program cloud computing systems capable of performing data processing at “the edge” of any given network rather than the center. In other words, edge computing uses distributed methods to process data near its source rather than relegating it to a centralized set of servers. The use and efficiency gains made possible through this approach may help solve some of the most difficult pain points in data management, particularly in industrial and enterprise applications.
As an example, consider the data collected and managed by a single turbine in a wind farm. Sensors on turbine equipment detect wind speed and direction, temperature, and other relevant data in real time. Rather than sending all of this information to the cloud, edge computing allows the system to store, manage, and apply it onsite in the manner most relevant to the individual turbine. The data necessary to adjust the tilt and pitch of the turbine to optimize performance stays within the turbine itself, which houses the computing equipment system operators can use to perform functions and applications at the edge of the system. While this is happening in the turbine nacelles at the edge of the network, aggregate farm-wide data may be sent to the cloud for future use and analysis. In this case, leveraging the power of edge computing allows the wind farm to apply the most relevant data in a manner that optimizes performance based on location.
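The turbine scenario can be sketched as code. In this hedged illustration, raw sensor samples stay on the edge node, where they drive an immediate pitch decision; only a small aggregate is queued for the cloud. The field names, cutoff speed, and pitch states are invented for the example, not taken from any real turbine controller.

```python
# Sketch of edge processing in a wind turbine: act on raw samples locally,
# forward only a summary to the cloud. All names and thresholds are invented.
from statistics import mean

def process_on_edge(samples, cloud_queue, cutoff=25.0):
    """Use raw wind-speed samples locally; forward only an aggregate."""
    avg_speed = mean(s["wind_speed"] for s in samples)
    # Local, latency-sensitive decision: adjust blade pitch immediately.
    pitch = "feather" if avg_speed > cutoff else "normal"
    # Only the aggregate leaves the turbine, saving bandwidth.
    cloud_queue.append({"avg_wind_speed": round(avg_speed, 1),
                        "samples": len(samples)})
    return pitch

samples = [{"wind_speed": v} for v in (22.0, 27.5, 26.5)]
queue = []
print(process_on_edge(samples, queue))  # average exceeds cutoff: "feather"
print(queue)  # one summary record instead of three raw samples
```

The design choice is the point: the latency-critical decision never leaves the nacelle, while the cloud still receives enough aggregate data for farm-wide analysis.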
Edge Computing Architectures
All edge computing is distributed in a manner that’s analogous to the example provided above, but there are a number of approaches developers utilize to capture the benefits of this model. The three most typical edge computing architectures are fog computing, mobile edge computing, and cloudlets.
Fog Computing
Fog computing was introduced by Cisco in 2012. It is a system-level horizontal architecture capable of distributing computing, storage, and control functionalities anywhere between a centralized cloud and a device at the edge of the network. Fog computing supports a standardized framework capable of handling the data-intensive requirements of expansive IoT applications.
As a distributed resource, fog computing provides useful tools for monitoring, managing, and securing resources across networks and between devices. Unlike the other architectures discussed below, however, fog computing works with the cloud to improve functionality and decrease latency by leveraging local ISP servers as an intermediary.
Mobile Edge Computing
Mobile Edge Computing (MEC) is a key architectural concept in edge computing and is viewed as a critical component for enabling the IoT functionality of the future. By definition, MEC provides an IT service environment with cloud-computing capabilities at the edge of a mobile network. Specifically, this architecture involves MEC applications that run as Virtual Machines (VMs) on top of a virtualized infrastructure. These applications can interact with the mobile edge platform in a manner that supports the procedures and life-cycle functions of the application. At the same time, the virtualization infrastructure they run on executes traffic rules to optimize performance among local applications as well as internal and external networks. This requires core offloading and mobility management technologies that migrate demanding computations from the mobile device that collected the data to nearby infrastructure that is resource-rich and well-suited for data management.
In short, MEC solutions place cloud-computing capabilities within the radio access network (RAN), in close proximity to mobile subscribers. This architecture offers lower latency, better proximity, higher bandwidth, and improved responsiveness, while also enabling a large number of new types of applications. MEC technologies are already feasible and valuable: Proof of Concept (PoC) frameworks are coordinating and promoting multi-vendor projects, and use cases continue to develop into a diverse and functional edge computing ecosystem.
Cloudlets
Centralized data management requires end-to-end responsiveness: mobile devices must send data to the associated cloud, which performs the appropriate analysis and either sends the result back to the same device or on to the next location on the network. Mobility-enhanced small-scale cloud Data Centers (DCs), or "cloudlets," may alleviate some of the demands this approach places on the network by establishing a computer or cluster of computers that is sufficiently trustworthy, resource-rich, and well-connected to ensure high network functionality.
Cloudlets serve as intermediaries in the edge computing architectures they occupy, providing physically proximate representatives of the centralized cloud with improved robustness and availability. Test cases have shown that cloudlets are capable of decreasing network response time by 51% and reducing energy consumption by 42%. These experiments confirm that cloudlets, and the Virtual Machine (VM) mechanisms that support them, offer a promising solution to the latency challenges created by increasingly distributed computing across the IoT.
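The offload decision a cloudlet architecture implies can be illustrated with a short sketch: a device sends work to the nearest node that is both reachable and fast enough, falling back to local processing otherwise. The node names, round-trip times, and deadlines below are made-up placeholders, not measured values.

```python
# Illustrative offload decision for a device-cloudlet-cloud hierarchy:
# pick the lowest-latency node that can meet the task's deadline.

def choose_target(task_deadline_ms, nodes):
    """Return the name of the best node, or "local" if none qualifies."""
    viable = [n for n in nodes if n["rtt_ms"] <= task_deadline_ms]
    if not viable:
        return "local"  # no remote node is fast enough; compute on-device
    return min(viable, key=lambda n: n["rtt_ms"])["name"]

nodes = [
    {"name": "cloudlet-a", "rtt_ms": 15},     # physically proximate cloudlet
    {"name": "central-cloud", "rtt_ms": 120}, # distant centralized cloud
]
print(choose_target(50, nodes))  # the cloudlet meets a 50 ms deadline
print(choose_target(5, nodes))   # nothing qualifies: process locally
```

Even this toy version shows why the middle tier matters: the centralized cloud alone cannot satisfy tight deadlines, while the cloudlet can.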
Regardless of the architecture employed, edge computing technologies all process the data captured at the far ends of a network close to where it is generated rather than transmitting massive amounts of information to a centralized cloud. This approach saves bandwidth and energy resources while also facilitating faster response time and a higher quality of analysis.
Edge computing distinguishes itself from the cloud by introducing localized network storage into the data management process. This lets IoT device users transfer data to distributed resources faster while maintaining communication with the centralized network.
Enterprises are turning to edge computing to improve their network infrastructure and efficiency of IoT device use. By distributing computing resources and applications services along the communication path between the data source in the IoT and the centralized network, companies are seeing better system performance at a reduced operational cost.
Distributed Computing Solutions
Edge computing complements the ongoing development of the IoT by providing a means by which big data can be processed and responded to with greater speed and quality. Because it provides improved efficiency and security for services demanded by the ever-increasing number of end-users engaged through the IoT, edge computing is becoming an increasingly important component of our current network infrastructure.
Solving Latency and Interoperability Challenges on the Edge
In a typical back-end data communication model, data collected by smart objects is exported to a cloud service or some alternative data management architecture. As discussed, this is raising meaningful challenges with respect to both latency and interoperability. To address these challenges, enterprises and industries are investing heavily in edge networks that improve back-end data communication.
The edge is designed to support computing functions distributed across a multitude of devices within a global network, which improves the delivery of network services to remote locations. This distributed computing approach is particularly well-suited for IoT applications, as data is collected from a diverse network of sites in these systems. Rather than sending this data to a centralized location, the edge allows data to be collected and managed within close network proximity. In addition to generally reducing communication bandwidth requirements and enabling faster data transfer, edge computing is opening the door for new computing latency-constrained applications.
Edge computing improves the orchestration of service delivery requirements within a network, which simultaneously addresses the latency and security challenges. With respect to latency, edge computing moves most or all computing functions to the edge of a network rather than straining resources to move all data to a centralized cloud. Many applications used for Augmented Reality (AR) or Virtual Reality (VR), smart cars, telemedicine, Industry 4.0, and smart cities require near real-time network response. Current latency constraints are holding back the development of these important technologies. Edge computing architectures improve the availability of network resources and reduce latency, which is key to the continued development of latency-sensitive applications.
Cloudlets are particularly well-suited for reducing network latency and addressing potential interoperability challenges, as cloud-edge computing combines the functionality of centralized data management with the efficiency of managing data across widely distributed nodes. Distributed computing approaches like this have been particularly useful in compliance applications in which specific industry regulations require the collection and management of multiple streams of data. Furthermore, by offering a consistent operating paradigm across diverse network infrastructures, edge computing approaches are well-suited for network applications that rely on several types of devices or collect various types of data. This offers a degree of consistency and flexibility that poses an elegant solution to the potential interoperability challenges created by the proliferation of IoT devices.
Using the Edge to Improve Network Security
The IoT has created a multitude of new vectors for cyberattack and invasion of privacy, but edge computing makes it possible for networks to maintain security elements at diverse locations across a network. This not only improves the likelihood that an attack will be identified, but also enables higher security performance by increasing the number of network layers a hacker must get through to achieve breach. These security improvements will prove increasingly significant as latency and interoperability challenges are addressed, opening up the market for greater integration of more advanced consumer and enterprise applications.
In order to improve network security in a system that integrates IoT devices, system designers must follow certain best practices. Edge applications should include built-in security protocols at every level of system architecture, and operators must actively manage computing and networking endpoints. Any attack must be immediately isolated and quarantined, and the system must be capable of healing any negative impacts from a breach. By doing so, distributed computing resources can be leveraged to greatly improve security and privacy across networks.
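The "isolate and quarantine" practice described above can be sketched as a simple partitioning step: endpoints showing anomalous behavior are cut off from the network while the rest keep operating. The device names and the anomaly signal (repeated authentication failures) are invented for this illustration; a real system would combine many signals and act on network infrastructure, not a Python list.

```python
# Minimal sketch of endpoint quarantine in an edge security policy:
# partition endpoints into active and quarantined sets. The anomaly
# test (failed-auth count) is a stand-in for real detection logic.

def quarantine_compromised(endpoints, max_failed_auth=3):
    """Split endpoints into (active, quarantined) lists of IDs."""
    active, quarantined = [], []
    for ep in endpoints:
        target = quarantined if ep["failed_auth"] > max_failed_auth else active
        target.append(ep["id"])
    return active, quarantined

endpoints = [
    {"id": "sensor-1", "failed_auth": 0},
    {"id": "camera-7", "failed_auth": 9},  # anomalous: likely under attack
    {"id": "lock-2",   "failed_auth": 1},
]
active, quarantined = quarantine_compromised(endpoints)
print(active)       # healthy endpoints keep operating
print(quarantined)  # the anomalous endpoint is isolated
```

Because the check runs at the edge, the compromised endpoint can be isolated immediately, before its traffic ever reaches the centralized network.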
Edge Computing Use Cases
By shifting computing functions from a centralized cloud to devices on networks, edge computing solutions offer improved communication pathways relative to what is currently available through traditional cloud-based or local processing technologies. This allows for increased performance and reduced operational costs across any number of industrial and enterprise applications, as well as more individualized benefits in specific applications like regulatory compliance and data security. Indeed, enterprises have identified more than 200 unique applications of IoT technologies that range from facilitating in-field monitoring to providing advanced solutions to common problems in network functionality.
Some of the brightest minds in tech have been working to develop edge computing uses and applications to address some of their most pressing enterprise challenges. Vodafone, Intel, and Huawei announced a partnership with Carnegie Mellon University in June 2015 intended to accelerate the development of the edge computing system based on cloudlets. The so-called Open Edge Computing (OEC) initiative has helped develop cloudlet-enabled software ecosystems and other important edge computing architectures. Similarly, the OpenFog Consortium – an organization founded by a collaboration of ARM, Cisco, Dell, Intel, Microsoft, and Princeton University in 2015 – has helped to accelerate the adoption of fog computing architectures.
Fortunately for the enterprises already dealing with the challenges created by increased reliance on IoT devices across networks, customized and off-the-shelf distributed solutions are entering the market. One such product in the field, Dis.co, is already serving customers with distributed computing applications for IoT devices. Like other distributed computing solutions, Dis.co is a platform that uses edge computing resources to improve device-to-device communications and data management by refining the use of computing resources distributed across a network. Dis.co is designed to activate every idle CPU or GPU in the network and put it to its best possible use. As an added functionality, Dis.co makes it possible for the owner of an idle CPU or GPU to lease its available computing power to other systems in need. This approach adds the potential to monetize idle capacity, which could motivate more efficient use of bandwidth resources while leveraging every available CPU or GPU in the system to create a serverless supercomputer.
As industry groups and consortiums are laying the groundwork for the greater deployment of edge computing solutions to address the issues raised by IoT expansion, the edge is being developed primarily for individualized enterprise and industrial applications. Edge computing solutions are already entering the market, and they continue to develop as demand for customized solutions in this space continues to grow.
Unlocking the Full Potential of the Edge
Processing data on the edge of networks where it is collected offers a clear means by which operators will be able to meet the rapidly proliferating and innovative demands that are arising as the IoT grows bigger each day. These demands will continue to shape the future of distributed computing, and they are coming primarily from three fields: virtualization, enterprise and industrial applications, and mobile edge on 5G networks.
Virtualization
Edge computing originated with the virtualization of network services, and early uses of this approach leveraged distributed computing power to add versatility to networking. Software Defined Networking (SDN) and Network Function Virtualization (NFV) are two developing distributed computing applications that stand to improve network functionality. By decoupling network hardware from control logic, SDN supports high network performance without requiring the resource investment of more complex technologies such as MPLS. Likewise, NFV technologies virtualize network functions in a manner that increases efficiency, user control, and automation potential by aggregating the diverse functions of physical assets into a single virtual platform.
Service providers are already offering access to enterprise NFV infrastructure and services that enable complete virtualization of enterprise network services. However, as competitors develop their own edge computing architectures, an increasing number of more refined virtualized network services will emerge.
Enterprise and Industrial Applications
Enterprise and industrial customers need edge computing solutions customized to their specific use cases. A company operating in retail needs a means by which information from inventories, shop floors, vendors, and customers can be managed effectively. Financial organizations and banks forced to grapple with increasing amounts of compliance-related data need a means by which this information can be properly monitored and processed. Any number of industries or enterprises collecting data from remote sensors can benefit from distributed computing solutions that allow operators to manage this data in faster and more value-driven ways.
Enterprise and industry customers are demanding unique solutions to the constraints created by their reliance on the IoT, and edge computing is efficient and flexible enough for deployment across any number of applications. Further, edge computing infrastructure can be coupled with other distributed network technologies to improve flexibility and uniformity of the system while also decreasing hardware requirements and standardizing data management across remote sites.
Edge applications can be customized for enterprise customers to run the same application uniformly across edge network hardware, boosting resiliency in the face of intermittent connectivity. This is particularly important in industries that rely on geographically remote equipment, such as oil rigs, pipelines, mines, and distributed power resources, by allowing for largely self-contained and autonomous site operations.
The potential applications of edge computing for enterprise and industrial customers are already vast, and they continue to grow as companies encounter unique IoT-driven constraints. Beyond meeting the needs of specific industries, however, distributed computing technologies will find an increasing number of mobile applications with the introduction of 5G.
5G Mobile Edge Computing
Mobile networks are integral to the functionality of the IoT, but they struggle with limited and unpredictable bandwidth. This limitation is preventing the expansion of key enterprise activities such as using AR for remote equipment maintenance and repair or effectively managing data captured through remote IoT devices. In many ways, mobile edge computing is waiting for 5G to have its moment in the sun, and standards bodies are devoting meaningful resources to defining and developing MEC.
The European Telecommunications Standards Institute (ETSI) has identified 5G as a technology integral to future generation networks. To ensure that 5G is properly deployed, ETSI has launched several initiatives to support the development of this critical infrastructure. One of the standards body's major milestones in this regard has been the establishment of the Industry Specification Group (ISG) on MEC, which is focused on developing a standardized, open environment for the seamless integration of third-party applications that leverage the power of MEC.
MEC is considered by many to be the most promising solution for handling advanced computing applications like mobile video streaming and smart city management. Currently, video streams from monitoring devices are processed and analyzed at a local MEC server. This local server extracts the data that can be transmitted to a distributed computing node, reducing core network traffic. The same can be said for AR/VR mobile applications, which demand real-time data collection in the uplink, computing at the edge, and data delivery in the downlink. MEC is also key to the further development of our smart transportation infrastructure, as smart cities relying on vehicular delay-tolerant networks must leverage MEC to facilitate responsive data management.
Edge computing innovations have been fueled by the growing need for solutions to the constraints the growing IoT is placing on our network infrastructures. At present, most of the devices that make up the IoT – smart phones, wearable devices, and similar consumer technologies – are still fundamentally constrained from a computing perspective. They remain limited in memory, battery life, and the capacity to dissipate heat. The increased deployment of edge computing across IoT devices remains constrained by energy consumption and similar physical constraints. These issues must be addressed for developers to continue to push the boundaries of distributed computing solutions across the IoT.
Like the devices in the IoT, current edge computing architectures come with their own practical limitations. For example, adding cloudlets to the middle tier of the device-cloudlet-cloud hierarchy improves network response time and reduces latency, but moving running workloads smoothly between tiers remains difficult. Remedying this issue requires a Virtual Machine handoff technology suitable for the seamless migration of data from device to cloudlet to cloud, which has yet to be developed.
Despite limitations on IoT hardware and current approaches to distributed computing architecture, distributed computing solutions continue to emerge because organizations are demanding them. Because edge computing solutions can be customized to address potential vulnerabilities in applications on a case-by-case basis, there are countless models for the ways distributed computing can improve network speed and functionality.
At present, the sheer volume of approaches to identifying development models and implementation patterns for edge computing solutions is overwhelming. As the best and brightest minds in tech continue to flesh out an optimal approach to the challenges created by our ever-evolving relationship with technology, industries and enterprises alike will continue to search for functional solutions to growing vulnerabilities the IoT is exposing in traditional cloud-based network infrastructures.