In other words, it’s not just technology for the sake of technology – it’s about operational excellence. That is where Dell Technologies comes in: we can help with this transformation from the inside out. So where does this transformation start? In this series of blogs, we will explore how Dell Technologies can help Telecom Service Providers meet the challenges of transformation by focusing on people first. This is a journey, much like many others today, but the destination is the Digital Workforce. An employee base that thrives within a cloud-first model will be the true engine for industry growth.

Workforce Transformation is a theme discussed across all organizations. Companies must transform with technology, even if technology is not what they do. Even companies that have consistently demonstrated excellence in delivering technology-based solutions are at risk, as the underlying architectures on which they have built their businesses change. The competitive pace of change creates internal pressure to adapt systems and processes, which often leads to unintended skill gaps. Many of these organizations feel they are behind and cannot keep up.

Let’s consider Telecommunications Service Providers as one segment representative of this change. They provide the backbone of the Internet, and the critical access and mobile infrastructure that lets the rest of the world continue its transformation journey. All businesses today are built on the Internet and cloud technologies, and industries have embraced mobility as a critical means of connecting to services, customers and partners. While 3G and 4G/LTE technologies were designed and used to deliver high-speed mobile data to the smartphone and tablet, with the advent of 5G, vertical industries will look toward a new age of mobility as a foundation for their future success.
To capitalize on this trend, organizations must transform their operational and organizational models to realize the full potential of 5G. This is especially true with new opportunities at the edge. Organizations must focus on aligning technology and organizational strategy; this will ensure they not only exist in the next five years, but also grow.

Dell EMC sees four pillars of Telecom Transformation:

- Network Modernization
- IT & BSS/OSS Transformation
- Digital Growth & Transformation
- Workforce Initiatives

To place this into context, it is worth considering what has shaped existing organizations. The scale, composition and structure of telecom organizations represent one of the defining features of the industry. This legacy has been shaped over time by diverse physical network functions, hierarchical systems management, regulatory & compliance restrictions and other industry-specific issues. The telecommunications industry has continually endured massive technology shifts and adapted to new business models; however, the rate of change and the pace of disruption only continue to accelerate.

Stepping back and examining the larger picture reveals a multitude of technology disruptions taking shape simultaneously within the industry. Network virtualization, OSS & BSS modernization, real-time analytics and advanced telemetry have been underway for some time. To these, planners and strategists must add 5G, other radio technologies (such as WiFi 6 and CBRS), new IoT paradigms and further disaggregation of access and edge networks. Underpinning all these changes are the ever-present currents of openness and open source. Taken together, these present challenges to any organization striving to adapt and reinvent itself.

In particular, a widespread belief holds that public cloud operating models (massively scaled within centralized data centers) have solved the challenges facing the Telco Cloud.
However, the industry continues to identify requirements at every layer – from facilities to infrastructure to skill sets to processes – that are unique. This lesson is important: Public Cloud is not a “lift-and-shift” to Telco Cloud. Public cloud has solved the challenge of deploying tens of thousands of systems at a handful of facilities; expanding to hundreds of systems at thousands of disparate facilities is a different problem space. Remote management, automation, orchestration and operations are problems unique to the Telco Cloud.

Furthermore, Public Cloud is built on standardization of a single resource building block. Standardized servers are made available in standardized racks, replicated across data center rows, and those rows are replicated across the data center. This homogeneous architecture meets the needs of the majority of tenants. The Telco Cloud, especially closer to the edge, is more heterogeneous, and the difficulty of reaching facilities requires that the right architectures and capabilities be delivered in as few iterations as possible.

With this in mind, implementing workforce programs designed to acquire new skills, change the culture and embrace innovation is critical for success. Returning to our themes of transformation, it is worth pointing out that the first three pillars all share the workforce consideration. It is pervasive throughout the entire company and, as such, must be a top priority for the leadership team.

For example, traditional job roles may no longer align to business-driven technology adoption. The ability to redefine roles and offer training programs designed for these new challenges should be leadership-initiated.
Today many organizations are focused on career skills that encompass web development, data science and analysis, advanced programming, cloud computing and API design, all within the construct of DevOps and agile methodology.

While this may seem at face value to be an internal set of challenges, the reality is that the problem statement can be recast to reflect a rapidly shifting external world that must, to some extent, be embraced, harnessed and brought within the organization in a meaningful way. The dynamics at play between external and internal forces (see graphic) can be characterized as follows:

- New technologies, communities and ecosystems are driving an innovation wave throughout the industry.
- Maximizing this potential requires new models of interacting with, adopting and embracing these currents of opportunity.
- A variety of traditional modes of operation can impede or create pressure on acquiring innovation.
- An implicit acceptance of mismatched operating models introduces paralysis.
Over the previous decade, technology transformed nearly every business into an IT-driven business. From farming to pharmaceuticals, information technology developments have led organizations to reimagine how they operate, compete and serve customers. Data is at the heart of these changes and will continue its transformative trajectory as organizations navigate the waves of technological progress in the next “Data Decade.”

In data storage – which touches every IT-driven business – the pace of innovation is accelerating, yet most enterprises continue to struggle with data’s explosive growth and velocity. Getting the highest use and value from their data is becoming ever more critical for organizations, especially those with data stores reaching exabyte scale. To have strategic value in the enterprise, storage innovation must cross the capabilities chasm from merely storing and moving bits to holistic data management.

In 2019, our Dell Technologies Storage CTO Council studied more than 90 key technologies and ranked which ones have the innovation potential to help storage cross that chasm in the next 5-10 years. This year, there are three key areas we believe will be difference-makers for organizations pushing the limits of current storage and IT approaches. Let’s take a closer look.

Trend #1: Machine learning and CPU performance unlock new storage and data management approaches

This year, we will see new approaches that solve streaming data challenges, including the use of container-based architectures and software-defined storage. Customers in industries such as manufacturing, cybersecurity, autonomous vehicles, public safety and healthcare want to build applications that treat data as streams instead of breaking it up into separate files or objects. Ingesting and processing stream data has unique challenges that limit traditional IT and storage systems.
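To make “treating data as streams” concrete, here is a minimal sketch of an append-only stream that supports replay from any historical offset – the same idea as pausing and rewinding a live broadcast. All names here are hypothetical illustrations, not any specific product’s API; production streaming storage adds durability, partitioning and autoscaling on top of this core idea.

```python
from dataclasses import dataclass, field

@dataclass
class EventStream:
    """A minimal append-only event stream with replay from any offset.

    Illustrative only: real streaming storage layers durability,
    partitioning and elastic scaling on top of this idea.
    """
    _events: list = field(default_factory=list)

    def append(self, event):
        """Ingest an event; the stream itself is the system of record."""
        self._events.append(event)
        return len(self._events) - 1  # offset assigned to the new event

    def read(self, from_offset=0):
        """Replay events from any past offset -- the 'DVR rewind'."""
        yield from self._events[from_offset:]

stream = EventStream()
for reading in ({"sensor": "a", "temp": 21}, {"sensor": "a", "temp": 22}):
    stream.append(reading)

# A late-joining consumer can replay the full history, then stay current.
history = list(stream.read(from_offset=0))
```

Because every event keeps its offset, “pause, rewind and replay” falls out naturally: a consumer simply re-reads from an earlier offset instead of asking the producer to resend anything.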
Since streaming workloads often change throughout the day, storage capacity and compute power must be elastic enough to accommodate them. This requires intelligence within the storage that can provide instant autoscaling.

By treating everything as a data stream, event data can be replayed the same way we watch a live sporting event on a DVR-enabled TV, where the program can be paused, rewound and replayed instantly. Until now, application developers have been limited in their ability to address use cases that leverage data as streams for capture, playback and archive. Enabling these capabilities will make it easier to build applications for use cases that were never thought of previously.

Dataset Management helps solve the data lifecycle problem

In the realm of data management, 2020 will usher in new approaches for organizations wishing to better manage data that is distributed across many silos of on-prem and cloud data stores. Data growth has been outstripping the growth of IT budgets for years, making it difficult for organizations not only to keep and store all their data, but to manage, monetize, secure and make it useful for end users.

Enter Dataset Management – an evolving discipline that uses various approaches and technologies to help organizations better use and manage data through its lifecycle. At its core, it is about the ability to store data transparently and make it easily discoverable. Our industry has been very good at storing block, file and object data, sometimes unifying these data in a data lake. Dataset Management is the evolution of the data lake, providing customers with the ability to instantly find the data they want and make it actionable in the proper context across on-prem and cloud-based data stores.

Dataset Management will be especially useful for industries (e.g. media & entertainment, healthcare, insurance) that frequently have data stored across different storage systems and platforms (e.g.
device/instrument-generated raw data, derivative data at a project level, etc.). Customers want the ability to search across these data stores to do things such as creating custom workflows. For instance, many of our largest media & entertainment customers are using Dataset Management to connect with asset management databases to tag datasets, which can then be moved to the correct data centers for special effects work or digital postprocessing, then to distribution and finally to archives.

Traditional methods for managing unstructured data only take you so far. Because of new technological advancements like machine learning and higher CPU performance, we see Dataset Management growing further in prominence in 2020, as it offers organizations a bridge from the old world of directories and files to the new world of data and metadata.

Trend #2: Storage will be architected and consumed as software-defined

We can expect to see new storage designs in 2020 that will further blur the line between storage and compute. Some of our customers tell us they are looking for more flexibility in their traditional SANs, wishing to have compute as close to storage as possible to support data-centric workloads and to reduce operational complexity.

With deeper integration of virtualization technologies on the storage array, applications can run directly on the same system and be managed with standard tools. This could suit data-centric applications that perform very storage- and data-intensive operations (e.g. analytics and demanding database applications), as well as workloads that require low transactional latency against large volumes of data.

This isn’t HCI in the classic sense; rather, it is about leveraging and interoperating with existing infrastructure and processes while giving a greater degree of deployment flexibility to suit the customer’s specific environment and/or application. It could open up new use cases (e.g.
AI/ML and analytics at edge locations and/or private cloud, workload domains, etc.); it could also lower cost of ownership and simplify operations for IT teams and application owners, who would not always have to rely on a storage admin to provision or manage the underlying storage.

Software-defined infrastructure no longer just for hyper-scalers

Software-defined infrastructure (SDI) is also becoming a greater consideration in enterprise data centers, augmenting traditional SANs and HCI deployments. Long the realm of hyper-scalers, SDI is now ready for adoption by traditional enterprises redeploying certain workloads whose capacity and compute requirements differ from what traditional three-layer SANs can provide. These are customers architecting for agility at scale who want the flexibility of rapidly scaling storage and compute independently of each other, or who need to consolidate multiple high-performance (e.g. database) or general workloads. As enterprises consider consolidation strategies, they will bump up against the limits of traditional SANs and the unpredictable performance, costs and lock-in of cloud services. This is where SDI becomes a very viable alternative to traditional SANs and HCI for certain workloads.

Trend #3: High-performance Object storage enters the mainstream

As Object moves from cheap-and-deep cold storage or archive to a modern cloud-native storage platform, performance is on many people’s minds. One reason we see this trending upward this year is demand from application developers. Analytics is also driving a lot of demand, and we expect to see companies in different verticals moving in this direction. In turn, the added performance of flash and NVMe is creating tremendous opportunity for Object-based platforms to support workloads that require speed and near-limitless scale (e.g. analytics, Advanced Driver Assistance Systems (ADAS), IoT, cloud-native app development, etc.).
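High-performance Object platforms typically pair a fast flash tier with cheaper disk and archive tiers, demoting objects as access cools. A minimal sketch of an access-recency tiering policy follows; the tier names and thresholds are hypothetical illustrations, not any specific product’s defaults.

```python
import time

# Hypothetical tiers, hottest first, with illustrative age thresholds
# (seconds since last access) for when an object may live on each tier.
TIER_POLICY = [
    ("flash",   7 * 86400),        # accessed within the last 7 days
    ("disk",    90 * 86400),       # accessed within the last 90 days
    ("archive", float("inf")),     # everything colder than that
]

def place_object(last_access_epoch, now=None):
    """Pick a storage tier from the time elapsed since last access."""
    now = time.time() if now is None else now
    age = now - last_access_epoch
    for tier, max_age in TIER_POLICY:
        if age <= max_age:
            return tier
    return TIER_POLICY[-1][0]  # fallback: coldest tier

# An object touched an hour ago lands on flash; a year-old one is archived.
hot_tier = place_object(time.time() - 3600)
```

A real platform would also track promotion (moving data back to flash before an analytics run) and amortize moves in bulk, but the economics rest on this simple idea: only recently active data pays the flash premium.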
As a side note: historically, Object storage hasn’t been fast enough for ADAS workloads, but all-flash is changing that conversation.

Flash-based Object storage with automated tiering to disk offers a cost-effective solution, particularly at hundreds of petabytes or exabyte scale. It allows you to move the data you need up to the flash tier to run analytics and high-performance applications, then move it off to a cold or archive tier when you’re done with it. As Object becomes tuned for flash and NVMe, we expect a higher level of interest in Object for data that has traditionally been stored on file-based NAS, such as images, log data and machine-generated data.

As the pace of technology innovation accelerates, so too will the possibilities in storage and data management. We are standing with our customers at the dawn of the “Data Decade.” If the last ten years brought some of the most dramatic changes in tech, just imagine what’s next.

Read what other Dell Technologies experts are saying about key technology trends in 2020 and beyond in the blogs below:

- “Technical Disruptions Emerging in 2020,” by John Roese, CTO, Products & Operations, Dell Technologies
- “Dell EMC’s 2020 Server Trends & Observations,” by Robert Hormuth, Vice President & Fellow, Chief Technology Officer, Server & Infrastructure Systems, Dell EMC
- “Dell 2020 Networking & Solutions Technology Trends,” by Ihab Tarazi, Chief Technology Officer and Senior Vice President, Networking and Solutions, Dell Technologies