
The New Magic Box: Three Ways Accelerated Computing is Transforming Enterprises


“Any sufficiently advanced technology is indistinguishable from magic” — Arthur C. Clarke

At this moment in history, we can watch movies on our phones and use our televisions to call our loved ones. Advanced computing is leading to more accurate medical diagnoses, breakthrough medical treatments and a better shopping experience. Businesses, through artificial intelligence, can recognize who their customers are, accurately predict what they are most likely to buy and when they are going to buy it.

Computing acceleration primarily comes from advances in silicon processing per Moore’s law: the number of transistors incorporated in a chip approximately doubles every 24 months, which roughly translates to a doubling of the chip’s performance. This has unleashed a level of technology proliferation unprecedented in human history. Digitization has also resulted in the proliferation of data of all kinds: personal, professional and machine data alike. However, the performance gains from Moore’s law are beginning to plateau. Engineers and scientists are finding that the performance offered by contemporary CPUs is quickly becoming either uneconomical or insufficient.
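The doubling rule above is easy to make concrete. A minimal sketch, assuming the 24-month doubling cadence stated above and treating performance as tracking transistor count:

```python
def moores_law_factor(years, doubling_period_years=2.0):
    """Projected transistor-count (and, roughly, performance) multiplier
    after `years`, assuming one doubling every 24 months."""
    return 2 ** (years / doubling_period_years)

# Over a decade, the rule predicts roughly a 32x gain:
print(moores_law_factor(10))  # → 32.0
```

The plateau the paragraph describes is precisely this curve flattening: as the effective doubling period stretches, the exponent stops delivering.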

Enter the accelerators.

The most common accelerator is the high-speed NVIDIA graphics processing unit (GPU). This specialized hardware is designed to perform one particular task more efficiently than a general-purpose CPU. GPUs have supported video games and image rendering for years. The primary computational requirement for video applications is matrix multiplication or, technically speaking, vector processing. The need for fast rendering of high-resolution images quickly overwhelms a general-purpose CPU. Video game hardware engineers solved this problem with GPUs. They’re so good that the leading-edge GPU from NVIDIA can easily crank out 125 TFLOPS.
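To see why rendering reduces to vector processing, note that moving or rotating a model means applying one small matrix to thousands of vertices at once, independent per-element arithmetic that GPUs parallelize. A toy 2-D illustration (real pipelines use 4x4 matrices and homogeneous coordinates):

```python
import numpy as np

def rotate_vertices(vertices, angle_rad):
    """Rotate an (N, 2) array of 2-D vertices about the origin using a
    single matrix multiplication."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    rotation = np.array([[c, -s],
                         [s,  c]])
    # One matmul transforms every vertex at once -- each output element is
    # independent of the others, which is what makes it GPU-friendly.
    return vertices @ rotation.T

square = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
rotated = rotate_vertices(square, np.pi / 2)  # 90-degree rotation
```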

Image Source: Nvidia.com

Before long, there was a dramatic acceleration of the performance of critical applications in diverse verticals with similar computational requirements. Verticals from financial services to manufacturing logistics to retail to scientific research to oil and gas exploration now use accelerators to help solve computational problems that they could not before.

Artificial intelligence, including machine learning and deep learning, has become more mainstream. Accelerated computing is becoming essential to provide the necessary performance to support these business-critical applications. In fact, Google recently stated that it could not run its various services without accelerated computing.

There are many reasons for an enterprise to use accelerated computing but here are the top three:

  • It is BETTER: Enables enterprises to cover workloads more comprehensively by leveraging machine learning / deep learning applications to analyze vast, unstructured data workloads
  • It is FASTER: Get to critical business insights faster. Depending on the application and supporting hardware, accelerated computing can boost performance from 10x to 100x
  • It is COST-EFFECTIVE: A denser but simpler infrastructure for better overall computational performance. It helps lower CapEx and OpEx and helps maintain reliable services. Leverage off-the-shelf accelerators and libraries to effectively support increasingly complex cognitive workloads

Adopting accelerated computing is an easy win for enterprises striving for competitive advantage. Dell EMC has expanded its leadership from HPC into AI. We offer a portfolio of accelerated computing platforms to support our customers’ diverse AI computational needs. Our customers are at various stages in the AI adoption journey and we realize that not all applications need the same category of performance.

Businesses with heterogeneous HPC workloads tend to use our PowerEdge R740XD, a workhorse platform that supports up to three accelerators and is designed for greater fault tolerance in critical HPC environments. Moving along the line of increasing computational complexity, we have the PowerEdge C6420, a server platform with innovative cooling options that provides maximum performance density for applications such as high-frequency trading. Our other platforms include the C6320p, the T640 and a few others under investigation.

As machine learning and deep learning applications gain greater adoption, we are very excited to announce the PowerEdge C4140, an ultra-dense, accelerator-optimized server platform designed to handle these intensive AI workloads. With its innovative interleaved GPU design, the C4140 can support four GPUs and provide the kind of unthrottled, no-compromise performance that customers have come to expect from Dell EMC PowerEdge servers. The C4140 can now deliver up to 500 TFLOPS for deep learning applications, and on a life sciences application it is 19x faster than an equivalent CPU-only system. Put another way, it would take 19 CPU-only servers to accomplish the same task as a single C4140. With NVIDIA’s state-of-the-art GPUs and PowerEdge servers, Dell EMC is helping businesses adopt machine learning and deep learning applications through our various HPC Ready Bundles.
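The headline numbers compose by simple multiplication. A back-of-the-envelope helper (the inputs below just restate the figures in this paragraph; this is sizing arithmetic, not a vendor benchmark):

```python
def node_peak_tflops(gpus_per_node, tflops_per_gpu):
    """Peak deep-learning throughput of one node: GPU count x per-GPU peak."""
    return gpus_per_node * tflops_per_gpu

# Four GPUs at 125 TFLOPS each reproduce the quoted 500 TFLOPS per node,
# and a 19x application speedup means one node stands in for 19 servers:
print(node_peak_tflops(4, 125))  # → 500
```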

Come and see us at SC 17 (Nov 13 –16) to learn more about the Dell EMC PowerEdge portfolio and some of the hush-hush products we are showcasing in our Whisper Suites. If you cannot make it, be sure to follow us on Twitter and check out Direct2DellEMC for the latest news and updates.

It is also worthwhile to consider that a high-speed interconnect is just as important as the high-speed computations. The next big thing on the scene is Gen-Z, a new data access technology developed by a broad-based industry consortium including Dell EMC. This technology provides the high-speed interconnect needed to allow system disaggregation and the ability to scale acceleration, compute, and memory independently.

The myriad ways enterprises are harnessing the superior performance of accelerated computing to drive new, wondrous applications are nothing short of magical.



The Power of Technology to Transform Human Progress


Digital transformation is on every CEO’s priority list. But can emerging digital technologies benefit more than just businesses, transforming nations and driving real socio-economic change?

The answer is YES…and it’s already happening.

Many countries are increasingly using technology to tackle key developmental challenges – from education to healthcare – and human rights issues like human trafficking. It’s no longer just developed Western nations leading this innovation – a recent study by The Economist shows that emerging markets offer countless examples of technology driving societal change. Examples include Kiron, an online university designed for global refugees; China’s Baby Come Home app to help parents find their missing children; and BIM, a mobile payment system to empower Peru’s rural populations.

The impact of technology on communities is extremely personal for me – the Dell EMC India Center of Excellence (COE) sponsors programs with government partners to transform rural healthcare across the country. Our foundation is a mobile, cloud and analytics solution that provides a unique health record for every citizen and connects health workers, doctors and decision-makers in a single, integrated platform.

 We have made a significant impact by leveraging Dell EMC technology and expertise to serve India’s rural healthcare industry (and this is just the beginning…):

  • Women’s Health: The Mahila Master Health Checkup program provides screening for non-communicable diseases in the state of Andhra Pradesh. Launched in September 2016, it aims to screen 7 million women through 12,000 health workers and more than 100 doctors.
  • Primary Care: The Mee Arogyam program covers basic care for 2.25 million people. It covers non-communicable and communicable diseases along with outpatient department care and provides doctors help in digitizing health records.
  • Comprehensive Primary Health: We partnered with Karuna Trust, a prominent non-profit organization, to digitize individual health records previously maintained by health workers in physical registers. The care areas include reproductive and child health, maternal health, school health, and communicable and non-communicable diseases. We started with a pilot in 2014 covering 23,800 individuals in one health center. Today, this partnership covers 25 health centers and impacts 315,000 individuals across 3 states.

The success in India’s rural communities would not be possible without government-driven programs like Digital India and Smart Cities, which formalize and prioritize digitization for the country. Across the world, governments play a critical role in driving scalable technological advancement and transformation. Government policies and frameworks need enthusiastic adoption by the private sector as well. As IDC states, “governments that plan to launch new digital transformation strategies for their countries should align strategic vision, people, process, technology and data elements of their digital agendas”. This is where companies can add tremendous value to ensure strategies are comprehensive and encompass the basics, such as connectivity, internet infrastructure and data and analytics.

As technology advances, governments and the private sector must work together to build the right ecosystem for it to operate within. From broadband highways to affordable smartphones, electricity infrastructure to legal frameworks and implementation strategies, we definitely have our work cut out for us. Exciting times lie ahead for companies, nations and all of humankind.


In the Global Video Security Business, Evolving Business Models Deliver Real Benefits


Do you want to know the secret of a successful business relationship? In my book, it’s trust, complementary skills and mutual need/benefit. Both parties need to have skin in the game, play to their strengths and be aware of each other’s expectations.

The Marketplace Is Challenging

Lofty words, but how does this translate to the IT business? As technology becomes more commoditized, companies are increasingly looking at different ways to do business, provide a better user experience, deliver more responsive services and remain relevant.

No big surprises as to why. It’s challenging out there. Companies have a shorter window to get product to market and gain competitive advantage. Did you know that about half of the companies listed on the Fortune 500 in the year 2000 have subsequently fallen off the list and many of these no longer exist in today’s world? In my view, today’s winners are companies with a long-term view of the market while the losers are companies that stand still and fail to innovate.

Together Is Better

As a result of this market dynamic, I’m seeing customers collaborate with the team at Dell EMC OEM to migrate from a hardware- or software-only model to a broad solutions approach. However, and this point is important to emphasize, it’s not about dabbling in what you don’t know. It’s about continuing to focus on your core strengths but developing the right alliances to add new expertise. As the saying goes, two heads are better than one.

Pelco and Dell EMC OEM Partnership

Take the video surveillance business as a case in point. According to market reports, it’s a thriving business, enjoying compound annual growth of around 8 to 10%. While it’s a fragmented market, it’s also a highly specialized niche, with about 10 significant players driving trends. Recognizing that video management and video surveillance systems are complex by nature, Pelco by Schneider Electric, a global leader in video surveillance and security products and technologies, recently launched its VideoXpert Professional VMS, powered by Dell EMC technology.

End-Customer Benefits

From an end-customer viewpoint, the benefits of such a relationship are pretty obvious. Customers tell us that they want easy-to-deploy video surveillance solutions that are smart, scalable and flexible. With the Pelco and Dell EMC OEM collaboration, they get an integrated, tested, validated solution from two best-in-class providers in the industry, backed up by award-winning global support. What’s not to like? Win-win!

Business Benefits for Pelco

However, what about Pelco? Apart from customer benefits, are there other pluses? Are we helping Pelco navigate the market and adapt its business model? If so, how? We talk all the time to our customers, and Pelco recently shared some great insights.

According to Pelco CEO, Jean-Marc Theolier, the collaboration is already delivering real benefits on the ground. “We now have a new, broad channel to market plus the credibility of co-branding with a global leader.”

However, innovation has been the biggest delivery. “Most importantly, the relationship has definitely strengthened our ability to deliver innovation to customers. For example, thanks to Dell EMC’s supply chain expertise, we’ve been able to significantly reduce customer lead-times and introduce our products to market faster, gaining significant advantage over the competition.”

Dell EMC’s global support footprint also gets the thumbs-up. According to Theolier, “We’ve found the whole process of gaining country regulatory certification much faster and easier. The quality of post-sales support is yet another unique selling point, which the competition simply cannot match.”

Meanwhile, John Roman, global VP of Strategic Accounts at Pelco, admits that he was unaware that Dell EMC had so much existing expertise in video surveillance technology. “I’ve been pleasantly surprised and impressed by the depth of knowledge, the way the team has collaborated with our subject matter experts, plus the great resources available at the Dell EMC OEM video labs. It’s been a real meeting of minds.”

What about the future? As Pelco is already active in adjacent non-surveillance areas like access control and POS, Theolier feels that there is good opportunity to expand market space by harnessing the Internet of Things. “Understandably, there are concerns about uptime, cybersecurity and liability. While we need to adopt a cautious, balanced approach about deploying new technologies, I am excited about the potential of IoT and believe that the benefits far outweigh the risks.”

Business Benefits for Dell EMC OEM

On our side, Pelco has brought huge video security expertise to the table while we are both gaining from best practice sharing in production, procurement and business processes.

From a sales perspective, this relationship is also proving to be a clear winner. Let’s not forget that in today’s world, video surveillance is relevant to so many vertical industries, everyone from retail and manufacturing to airports, healthcare, finance, education, highway monitoring – you name it, they’re on the list.

As IP cameras capture ever-higher resolutions, demand grows right across the stack: additional bandwidth, compression and intelligence; a long-term central repository to store the data for ever-increasing retention times; monitors to display 4K surveillance cameras in HD resolution; high-powered workstations to show video streams; virtualization to run the applications; high-end compute to process all the video streams; and software to analyze the information and derive insights from it. All of this plays to our core strengths and capabilities.
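The storage demand described above is straightforward to estimate: camera count times bitrate times retention window. A rough sketch with invented inputs (real deployments add motion-triggered recording, compression tuning and RAID/replication overhead):

```python
def retention_storage_tb(cameras, mbps_per_camera, retention_days):
    """Raw storage (decimal terabytes) to retain continuous footage."""
    seconds = retention_days * 24 * 3600
    total_bits = cameras * mbps_per_camera * 1e6 * seconds
    return total_bits / 8 / 1e12  # bits -> bytes -> TB

# 100 cameras streaming 4 Mbps each, kept for 30 days:
print(round(retention_storage_tb(100, 4, 30), 1))  # → 129.6
```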

For me, Pelco and Dell EMC OEM are bringing it all together in a full stack, soup-to-nuts solution, from camera to recording systems, from client to the network through to compute, storage and into the cloud. It’s this type of collaboration that truly brings added value to both companies and to our joint customers.

Last Words

But, let’s leave the last word to Pelco. Theolier says he is optimistic and excited about the future, “I think the Dell EMC merger is great not only for our partnership but for the broader industry. In our experience, the transition has gone incredibly smoothly with no disruption to customers. I am excited about the capacity of the combined company, the breadth of the product portfolio and the positive impact of this alliance.”

Great to hear as we celebrate our first anniversary as a combined company! I’d love to hear how you are adapting your business model to remain competitive and what we can do to help.

 

Watch a video about the Dell EMC and Pelco partnership here.

 Learn more about Dell EMC OEM

Follow @DellOEM on Twitter, and join our LinkedIn OEM Showcase page.

 

 


Accelerating Into AI with Machine Learning & Deep Learning


I consistently hear from customers that one of their biggest challenges is how best to manage and learn from the ever-increasing amount of data they collect daily. It’s a significant contributor to why the artificial intelligence (AI) market is forecast to increase from more than $640 million in 2016 to nearly $37 billion in 2025, with AI workloads growing at an estimated annual rate of 52%1. The rapid growth of data and new technology advancements have made it economically viable to adopt machine learning to disrupt new markets, improve operations and carve out a competitive advantage. Working with our strategic technology partners, we’re able to bring these powerful capabilities to organizations of all sizes and industries in more ways than ever before.

At this week’s Supercomputing 2017 conference, we unveiled THREE new solutions that converge our HPC and data analytics expertise with next-generation strategic partner technology and equip organizations to unlock faster, better and deeper data insights. First, what we coin the ‘bedrock’ of any modern data center: the PowerEdge C4140, an ultra-dense, accelerator-optimized server platform designed to handle intensive AI workloads. Coupling the C4140 with our HPC know-how, we’ve built a new family of Ready Solutions: the Dell EMC Ready Bundles for Machine Learning and Deep Learning, which enable organizations to realize advancements across a wide array of use cases.

When you consider the applications of machine and deep learning in areas like strengthening security with facial recognition, improving health care, and understanding human behavior in retail, the possibilities are endless and exciting!

Take, for example, these customer stories:

MasterCard leverages artificial intelligence to help protect consumers against credit card fraud. With approximately two million rules applied to automate spend tracking, they handle 160 million transactions per hour and 52 billion per year. Utilizing Dell EMC machine learning technologies, they’ve accelerated the speed at which they can retrieve and validate transaction data, as well as apply new rules to prevent unauthorized card usage. Of equal importance to stopping unauthorized charges is ensuring that genuine charges are not falsely flagged as fraudulent and prohibited. As their machines become more intelligent, MasterCard is moving closer to a model of complete and proactive oversight, where inaccuracies are prevented before they occur and customer disruption is minimized.
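The applied rules mentioned here can be pictured as a set of predicates evaluated per transaction, with a transaction held for review when any rule fires. The rules below are invented for illustration and are not MasterCard’s actual logic:

```python
# Hypothetical screening rules: each maps a rule name to a predicate.
RULES = {
    "large_amount": lambda tx: tx["amount"] > 5_000,
    "foreign_new_merchant": lambda tx: tx["foreign"] and tx["new_merchant"],
}

def flag(tx):
    """Return the names of all rules this transaction trips (empty = pass)."""
    return [name for name, rule in RULES.items() if rule(tx)]

tx = {"amount": 12_000, "foreign": False, "new_merchant": True}
print(flag(tx))  # → ['large_amount']
```

The second concern in the paragraph, genuine charges being falsely flagged, is exactly the false-positive rate of this rule set, which is why new rules are validated against historical transaction data before deployment.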

The Texas Advanced Computing Center (TACC) at The University of Texas at Austin is a leading academic center that uses machine learning for critical scientific discoveries, such as identifying brain tumors, developing cures for cancer, and forecasting severe weather conditions, like tornados. In partnership with Dell EMC, Intel, and Seagate, their “Stampede 2” supercomputer is ranked No. 12 on the latest TOP500 list as one of the most powerful computer systems in the world with its more than 18 petaflop performance. These technologies empower them to support thousands of researchers running simulations too complex for their normal desktop environments.

As another example, Simon Fraser University’s “Cedar” supercomputer is helping researchers in Canada study bacterial DNA patterns in collaboration with worldwide public health agencies. Unlike the many HPC systems built for narrowly targeted applications, Cedar is designed to run a wide variety of scientific workloads, including those related to personalized medicine, green energy technology, AI and the AI sub-fields of machine learning and deep learning. Simon Fraser’s research findings are leading to better and faster infectious disease control measures, like new vaccination programs, to help keep humankind safe.

We’re at the forefront of many incredible breakthroughs in artificial intelligence, stemming from the mastery of machine and deep learning, and only scratching the surface of what’s possible.  We told you in October that we’re committed to investments in this space and making AI a reality for all customers. We’re making excellent headway as a company and have already made several exciting announcements, including accelerated computing platforms to support our customers’ unique AI performance needs.

Working with the best and brightest minds to better the world is what energizes me and our Dell EMC teams every day to propel our HPC mission towards artificial intelligence-based customer and industry outcomes. Stay tuned for even more announcements and customer successes in this space, as together we advance technology to drive human progress.

1 Tractica, Artificial Intelligence Revenue to Reach $36.8 Billion Worldwide by 2025, Aug 2016


Technology Helps Protect Society and Inform Urban Planning


Technology is no longer just a business tool – it is also helping to solve social issues. Take the question of personal and public security, which is a growing concern in today’s world.

For example, as a parent, have you ever had the awful experience of your two-year-old wandering off in a busy shopping mall? One minute, they are beside you. You turn your head for literally a moment and when you look back, your son or daughter seems to have vanished into thin air. Chances are the child has just wandered off innocently and no abduction is involved, but the panic of that moment stops you straight in your tracks and you are sick with worry until your child is safely located.

At the Airport

Picture a busy airport, milling with people. A bag – abandoned in the check-in area – has been designated as a potential security threat. It may be an innocent mistake on the part of a distracted passenger, or it could represent a terrorist attack and a risk to everyone at the airport. What does security do? How do they quickly identify the owner?

Of course, nothing can ever replace the importance of traditional policing, smart intelligence, surveillance and the presence of police on the ground, but the Internet of Things, coupled with secure CCTV technology, is certainly putting real-time data at the fingertips of both police and security personnel.

In the Shopping Mall

Take, for example, the case of the missing toddler. Imagine the security guard using his/her smartphone – loaded with special software – to photograph the parent for immediate upload into the shopping centre’s facial recognition system. Armed with this image, the system instantly searches the footage from that day and identifies when the parent first arrived at the shopping centre with the child.

Having extracted this footage, Security can then enroll the child’s face into the online facial recognition system. This automatically searches for the missing child across all the CCTV cameras in the network, tracking the movement of the child in real time – where they have been and where they are right now. Based on the GPS coordinates, the guard closest to the child is automatically alerted and the family is quickly reunited. This whole process – from start to finish – takes minutes, helping to quickly resolve a very traumatic experience for both parent and child.
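Under the hood, a search like this typically compares face “embeddings” – numeric vectors produced by a recognition model – against recent sightings from each camera. A minimal sketch with made-up vectors and threshold (real systems use high-dimensional embeddings and approximate nearest-neighbour indexes):

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def cameras_showing(target, sightings, threshold=0.5):
    """Return IDs of cameras with a sighting within `threshold` of `target`.

    sightings: {camera_id: [embedding, ...]} of recently seen faces."""
    return [cam for cam, embeddings in sightings.items()
            if any(euclidean(target, e) <= threshold for e in embeddings)]

child = [0.1, 0.9]  # enrolled embedding of the missing child
sightings = {"cam_lobby": [[0.12, 0.88]], "cam_exit": [[0.9, 0.1]]}
print(cameras_showing(child, sightings))  # → ['cam_lobby']
```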

 Real-Time Tracking

Let’s switch to the scenario in the airport. The IoT-based CCTV system quickly locates the abandoned bag – even if partially obscured. It then jumps back to the relevant footage and enrolls the face of the person who left the bag there. This image is transmitted to all cameras in the network and the person’s location is automatically tracked in real time. An urgent alert – complete with a photograph of the person and details of the incident – is automatically sent to the nearest security guard for action. The likelihood is that the episode was simply an innocent mistake, but IoT-enabled personal devices with face recognition technologies, connected to a database of criminals, can proactively warn police when convicted offenders are in the vicinity.

Other Developments

In other security developments, the New York City Police Department has tested acoustic sensors, which can detect illegal gunshots to provide real-time alerts to police in busy precincts. Many police officers now wear body cams on the beat with studies indicating that they improve self-awareness and help promote the right behaviour from both the police and those they interact with.

The bottom line is that police agencies across the world are moving toward more data-driven approaches to solving crimes. Machine learning is particularly good at identifying patterns and can be useful when trying to discern a modus operandi of an offender, particularly in the case of serial crime.

Supporting Urban Planning

Let’s switch to a more benign setting. Maybe you work in the local planning authority. How do you make public spaces in the city work better for citizens? What is the air pollution level like at any given moment in time? What streets in the city centre attract the most foot fall?  What is the percentage of car users versus pedestrians and cyclists?

Data Is the Answer

Thanks to the use of sensors, IoT CCTV and analytics, planners can now better understand foot fall patterns – how many people are going where, how and when. It’s important to say that in this instance, people are not individually identified – rather, the planners are looking at aggregated data to help determine infrastructural requirements, like the number of required cycle ways, car lanes, footpaths, parks and bins.
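The aggregation step is the key privacy property: the planner’s input is a stream of anonymous (location, time) events and the output is counts, never identities. A minimal sketch with invented sample events:

```python
from collections import Counter

def footfall_by_location(events):
    """Count anonymous sensor events per location.

    events: iterable of (location, timestamp) tuples."""
    return Counter(location for location, _ in events)

events = [("Main St", "09:00"), ("Main St", "09:05"), ("Park Lane", "09:07")]
print(footfall_by_location(events))  # → Counter({'Main St': 2, 'Park Lane': 1})
```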

There are other potential benefits. For example, business-people looking to open a new shop could potentially be given accurate figures for foot fall near their proposed location to help them assess the potential for their new venture.

Smart parking can also use sensors and devices to help drivers quickly locate parking spaces, reducing congestion and fuel emissions. There are also obvious public security benefits. Apart from detecting and preventing vandalism and crime in real time, in the event of an accident, or say an elderly person falling, the emergency services can be automatically summoned to the scene.

Smart Partnerships

So, what role does Dell EMC OEM play in all of this? The answer is simple. We collaborate with specialist video surveillance and security partners, like iOmniscient, Milestone, V5 Systems and Pelco, to power their solutions. Our partners provide the IP while we provide the customised hardware platform and support services.

Of course, it goes without saying that criminals and hackers will try to exploit any vulnerability they can find in new security systems.  All these interconnected networks and devices need the right levels of security, built in from the start to protect both the cities and their citizens. That is where we can also add value. We have a dedicated focus on surveillance with experts available in our IoT lab to collaborate with specialist video surveillance and security partners.

As a society, I believe that we need to continue to respect the importance of individual privacy while carefully balancing this against the need to protect the common good.

What are your views on technology being used to improve security and urban planning? I would love to hear your comments and questions.

 

 


[The Source Podcast] Pizza, Elevators, Data Analytics and Business Opportunity


It all starts with data analytics: the practice of applying modern analytics software tools across data of all types, including unstructured, semi-structured and structured data, as well as real-time/streaming and batch. The primary goal is discovering insights that enhance the understanding of business and customer behavior. These analytics-driven insights can be used to shape business outcomes, improve competitive advantage, enhance financial decisions and develop more precise projections.

I sat down with Erin Banks (@BanksEK) aka #BigDataBanks at the Dell EMC Forum Montreal to get the latest.  From pizza to elevators to Mexican food and grocery stores, Big Data is nothing without Big Data Analytics.  We have the details this week.   For more information visit: dellemc.com/bigdata or Email: data_analytics@dell.com.

Get The Source app in the Apple App Store or Google Play, and Subscribe to the podcast: iTunes, Stitcher Radio or Google Play.

Dell EMC The Source Podcast is hosted by Sam Marraccini (@SamMarraccini)

 


Megaplus Helps Enable the Next Generation with Dell Vostro


Many Dell EMC partners are winning big with the combination of the Dell Technologies portfolio of products and solutions and the local market expertise they bring. A great example of this is Megaplus.

In Megaplus’ home market of Pakistan, they have worked closely with the Punjab Higher Education Department (HED) to enable the next generation to compete globally—with the help of Dell Vostro, supported by Dell EMC PowerEdge servers.

In a deal worth $70m, the Punjab HED sought to reduce the digital divide for their students and enable them to compete on a global scale. They created an innovative program to provide 115,000 laptops to merit students. As a direct result of this program, the Punjab HED has seen a growing number of entrepreneurs who in turn generate higher incomes.

Syed Raza Ali Gilliani, Minister at the Punjab HED, shared:

“We wanted to partner with global technology vendors who share the same vision as us; that’s why we work with Megaplus and Dell. We’re partnering with the best of the best.”

Partner with Dell EMC to win big with digital transformation.

Learn more about Dell Client Solutions and watch the Megaplus case study below.


Dell EMC Forum: Realize Your Digital Future


Technology is advancing at an exponential rate, changing how we live and work. We’re seeing customers, in every industry, fundamentally rethink their business models. IT is at the heart of this transformation and key to competitive advantage.

Start having these conversations today. Join us at a Dell EMC Forum near you to discover new ways to use technology to better serve your customers. We’ve had over 30,000 attendees in over 60 cities around the world. The good news is that there are still a dozen events happening worldwide through 2017.

In New York, Michael Dell talked about the approach Dell Technologies is taking to deliver a unified IoT strategy to customers. We also heard from AeroFarms, which is redefining agriculture by setting new standards for product quality and production.

In Milan, I heard inspiring stories from customers who increased company efficiency and productivity by modernizing their IT infrastructure and workforce tools. There were also stories of companies putting data to work to provide new and valuable business insights.

There has never been a more important time to rethink what’s possible and accelerate your transformational journey. So, join us and let’s make your digital future a reality.



CPG Industry Levels Playing Field with Power of One


Special thanks to Brandon Kaier (@bkaier) for his research and thoughts on the Digital Twins concept.

Unilever, one of the Consumer Packaged Goods (CPG) industry’s titans with over 400 brands and annual sales greater than $60B, recently bought Dollar Shave Club for $1B. Normally I would not think twice about such an acquisition; $1B is peanuts in the world of mergers and acquisitions.

However, this one feels different.

Two billion people use Unilever products every day according to Unilever’s 2015 annual report. Dollar Shave Club only has around two million members, the vast majority of whom are likely already Unilever customers. So I don’t think Unilever bought Dollar Shave Club for their customer base.

The Harvard Business Review speculates in 5 Ways to Help Employees Keep Up with Digital Transformation that “Unilever has acquired Dollar Shave Club, a young startup, for $1 billion in a move to introduce a new model of subscription sales.”

It seems that Unilever could have easily created their own subscription model without having to pay $1B for customers with whom they already have a relationship. So I don’t believe that Unilever just bought a subscription model. Instead, I think Unilever bought a capability: the capability to capture and mine individual customer purchase behaviors – the frequency, recency, intensity, magnitude and monetary value of purchases at the level of the individual consumer – and to eventually apply this analytic capability across more of their brands.

Think about the purchase behavior details Dollar Shave Club has on each of its individual subscribers. Unilever has no similar behavioral knowledge or insights at the level of the individual consumer; they only know how much product they push through retailers and distributors like Walmart, Kroger and Target.

To be actionable, Big Data must get down to the level of the individual – the “Power of One.” Big Data enables capturing customers’ individual tendencies, propensities, behaviors, patterns, associations, and relationships in order to monetize the resulting customer, product and operational insights (see Figure 1).

Figure 1: “Power of One” to Understand and Monetize Individual Customer Insights

The Power of Digital Twins

Digital Twins is a concept that exploits the “Power of One.” Picked by Gartner (“Gartner Top 10 Strategic Technology Trends for 2018”) as one of the top 10 strategic technology trends in 2018, Digital Twins couples virtual and physical worlds to facilitate analysis of data and monitoring of systems in order to avert problems, prevent downtime, develop new opportunities and support planning via simulations[1].

But the Digital Twin concept isn’t new. It was originally developed by NASA to help manage unexpected “situations” that might occur during space travel (remember Gary Sinise in the movie “Apollo 13”).

NASA grappled with the challenge of designing things that travel so far away, beyond the ability to immediately see, monitor or modify. NASA’s innovation was a Digital Twin of the physical system, a complete digital model that can be used to operate, simulate and analyze an underlying system governed by physics[2].
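The avert-problems idea behind a digital twin can be sketched as a tiny monitoring loop. This is a toy illustration only; the class, telemetry fields and thresholds below are invented for the example and do not reflect NASA’s or GE’s actual implementations:

```python
# Toy digital twin: a virtual model shadows a physical asset and flags
# divergence between expected and observed behavior before it becomes
# downtime. All names and values here are hypothetical.
class Twin:
    def __init__(self, expected_temp=70.0, tolerance=5.0):
        self.expected_temp = expected_temp  # what the model predicts
        self.tolerance = tolerance          # acceptable drift

    def ingest(self, telemetry):
        """Compare real telemetry against the model's expectation."""
        drift = abs(telemetry["temp"] - self.expected_temp)
        return "ALERT" if drift > self.tolerance else "OK"

twin = Twin()
print(twin.ingest({"temp": 71.2}))  # OK: within tolerance
print(twin.ingest({"temp": 82.5}))  # ALERT: investigate before failure
```

A production twin would of course run physics-based simulations rather than a fixed threshold, but the operate-monitor-compare loop is the same.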

This Digital Twin concept is being embraced throughout the Industrial Internet of Things (IIoT) world. GE may be the most famous IIoT company to adopt it; see “Digital Twin at Work: The Technology That’s Changing Industry.”

To quote GE:

Digital twin eliminates guesswork from determining the best course of action to service critical physical assets, from engines to power turbines. Moving forward, easy access to this unique combination of deep knowledge and intelligence about your assets paves the road to optimization and business transformation.

But Digital Twins isn’t just relevant to IoT. The Digital Twins concept, when instantiated via Analytic Profiles, plays a major role in understanding and monetizing human behaviors as well.

Analytic Profiles: Simplifying Digital Twins Concept

I blog and teach frequently on how organizations can embrace Analytic Profiles as a mechanism to capture, refine and share analytic insights at the level of individual humans and things.

Analytic Profiles provide a storage model (think key-value store) for capturing the organization’s analytic assets in a way that facilitates the refinement and sharing of those analytic assets across multiple business use cases. An Analytic Profile consists of metrics, predictive indicators, segments, scores, and business rules that codify the behaviors, preferences, propensities, inclinations, tendencies, interests, associations and affiliations for the organization’s key business entities such as customers, patients, students, athletes, jet engines, cars, locomotives, medical devices, and wind turbines (see Figure 2).
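As a rough sketch, an Analytic Profile can be modeled as a typed key-value record per business entity. The field and key names below are hypothetical, chosen purely for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class AnalyticProfile:
    """Key-value container for one business entity's analytic assets."""
    entity_id: str
    metrics: dict = field(default_factory=dict)    # e.g. purchase frequency
    scores: dict = field(default_factory=dict)     # e.g. churn risk score
    segments: list = field(default_factory=list)   # behavioral segments

# Build and enrich a profile for a single (hypothetical) customer.
profile = AnalyticProfile("customer-123")
profile.metrics["purchase_frequency_per_month"] = 4.2
profile.scores["churn_risk"] = 0.18
profile.segments.append("subscription_loyalist")
```

Because each profile is keyed by entity, the same record can be refined by one use case (say, churn modeling) and reused by another (say, next-best-offer targeting).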

Figure 2: Customer Analytic Profile

Analytic Profiles provide an operational framework for capturing, refining and sharing the organization’s analytic assets. For example, Analytic Profiles provide the foundation for clustering customers into similar behavioral segments, creating detailed behavioral and usage profiles based upon purchase behaviors, and calculating the current and predicted customer lifetime value (see Figure 3).

Figure 3: Leveraging Analytic Profiles to Determine Predicted Customer LTV

See the blog “Analytic Profiles: Key to Data Monetization” for more details on Analytic Profiles.

Without the analytic insights captured, refined and shared within Analytic Profiles, you lack the customer, product, service, operational and market insights that are powering new trends, such as those in Figure 4 below.

Figure 4: Gartner’s Top 10 Strategic Technology Trends for 2018

CPG Firms: Leveling the Playing Field

A company called MoviePass (“Why MoviePass’s Crazy Cheap Subscription Just Might Work”) is promoting what appears to be a totally unsustainable subscription business model: pay $9.95 per month to see any movie you want in a movie theater, AND the movie theater is reimbursed full price for the ticket. On the surface, that doesn’t seem to make any financial sense. However, the insights MoviePass is gaining about the behaviors, tendencies and inclinations of moviegoers, and the movies they watch, are likely to open all sorts of new monetization opportunities. Those insights can help filmmakers, producers and studios turn a profit in areas such as movie planning, budgeting, development, customer profiling, customer targeting, promotion, advertising, merchandising, foreign sales, and DVD/TV/video-on-demand streaming rights.

This shift towards subscription business models could give CPGs an opportunity to level the playing field with retailers, who have detailed customer transactional data courtesy of their Point of Sale systems and customer loyalty programs. These subscription business models, coupled with Analytic Profiles, provide an opportunity for CPG firms to gain rich insight into the behaviors of individual customers that can drive research, product development, marketing, advertising, sales and customer service.

With these detailed consumer insights, CPG companies could start operating more like Netflix in their ability to monetize their customers’ purchase and behavioral insights (see the blog “Netflix Intelligent: Something Every Company Can Do!”).

[1] “What Is Digital Twin Technology – And Why Is It So Important?” by Bernard Marr

[2] “The Rise of Digital Twins”

The post CPG Industry Levels Playing Field with Power of One appeared first on InFocus Blog | Dell EMC Services.

Want to Stay Fit and Relevant in the Digital Age? Go Cloud Native.


Contemplating the year 2030, just a dozen or so years from now, can be head-spinning. “Realizing 2030: The Next Era of Human-Machine Partnership” makes for interesting reading on how rapidly evolving and converging technologies may impact individuals and organizations.

One striking image evoked by the paper is that of humans acting like “digital resource conductors.” In my mind’s eye I see each of us with our conductor’s baton, directing holograms of useful services in 3D virtual reality to “orchestrate, manage, and automate many day-to-day activities.”

Los Angeles-based DAQRI is using AR devices to display information and work instructions over a worker’s environment, enabling them to complete a task safely and efficiently.

I like this image because I prefer a future in which humans are the “actors” rather than the “acted upon.” And I believe that by thinking and acting wisely, we can do more than position ourselves for the future, we can help to create the kind of future we want.

Fit and Relevant in the Digital Age

Recently, a large customer in APJ complained that everybody understands the threats of digital transformation—intensifying competition, disruptive technology and business models, and relentless, rapid change—but what’s not understood is how to “stay fit and relevant in the digital age.”

As the Realizing 2030 paper reports, a Dell Digital Transformation Index study conducted with 4,000 senior decision makers from across the world found 45 percent concerned about becoming obsolete in 3-5 years. Nearly half said they don’t know what their industry will look like in just three years’ time, and 73 percent believe they need to be more ‘digital’ to succeed in the future.

“How to stay fit and relevant in the digital age” seems a very good way of putting the challenge these leaders were describing. The phrase came back to me later, while meeting with the applications development team at a large telecommunications company in APJ. The team was laser-focused on creating a differentiating “cloud native application environment of the future.” They were applying cloud native architecture and the 12-factor app methodology for modernizing applications—and adopting a microservices architecture leveraging Pivotal Cloud Foundry, Spring and other microservice technologies on AWS for agile new cloud native application development.

Go Native

Reflecting later on how the telecom applications development team was moving ahead to create their digital future, it seemed to me that some very good advice for “staying fit and relevant in the digital age” would be to: “Go native.”

Traditionally, “go native” means to adopt the customs and way of life of the country or region where one happens to be. What I’m suggesting is that, in recognition of where we stand today, we “go digital native” and “go cloud native.”

Digital Native

No matter what industry you are in, the digital native generation (and their descendants) make up a rapidly growing proportion of your customers.

Oculus VR with Dell Precision Workstation
Students use Virtual Reality for an immersive and educational
experience.

Which means that even those of us born too soon should work to understand and begin to think like the generation that grew up in the digital world.

One thing we know about digital natives is that they are already very comfortable ‘conducting’ their lives online. Indeed, within the next three years, it’s estimated that 50 percent of the products and services that all businesses sell will be digitally enhanced.

Another thing we know about digital natives is that they are very, very quick to change their brand loyalty and buying behavior. To win and retain digital natives, you need to deliver value, speed, convenience, and innovation—not just once, but frequently and consistently.

Cloud Native

That’s why, my second piece of advice is to “go cloud native”—and as quickly as possible.

Without a cloud native application environment and agile development, you simply can’t innovate and deliver fast enough—and cost-effectively enough—to stay fit and relevant in today’s digital marketplace.

Simply put, digital innovation is the new value creation—and cloud is the way that value is delivered. For example, a leading luxury auto firm we work with here in EMEA no longer refers to themselves as an “auto manufacturer,” but as an “automotive technology company.” Other customers are similarly working to transform and re-position themselves in the marketplace as “digital technology companies.”

As Realizing 2030 puts it: “Increasing innovation in cloud-native apps and their propensity to be built and deployed in quick cadence to offer greater agility, resilience, and portability across clouds will drive further uptake. Start-ups are starting to use cloud-native approaches to disrupt traditional industries; and by 2030, cloud technologies will be embedded.”

Figure 1: Chitale Dairy launched the ‘cow to cloud’ initiative to improve the health
and well-being of cows on dairy farms in India.

For a beautiful example, take a look at how Chitale Dairy improves the economic well-being of dairy farmers in India with their ‘cow to cloud’ initiative!

I predict that within three years, 75 percent of IT spend will be driven by cloud native applications. To be able to deliver the world-class experience that digital customers demand will take the right cloud infrastructure capabilities, the right open native cloud software platform environment and the right agile and DevOps processes.

What steps are you taking to remain fit and relevant in the digital age?

The post Want to Stay Fit and Relevant in the Digital Age? Go Cloud Native. appeared first on InFocus Blog | Dell EMC Services.

RSA Charge 2017 Best in Show Award Winners


At this year's RSA Charge, it was amazing to see so many Compliance, Risk and Security professionals in one place, learning from subject matter experts and from each other through technical deep dives and business-driven use cases focused on delivering best practices and lessons learned. I had the opportunity to speak with many RSA customers and was inspired by the great work they are doing.

One of the highlights of the event was that over 100 RSA customers got up on stage during RSA Charge to present their unique use cases and the challenges and opportunities they have addressed with the help of RSA solutions. Thank you for sending us your feedback; it is great to see that overall you felt the sessions were impactful and valuable.

During RSA Charge you completed evaluations for the sessions you attended. These provide us with great information, including which sessions you enjoyed the most – and you confirmed that one presentation from each RSA Suite clearly stood out as the BEST!

Out of 92 outstanding breakout sessions that took place on Wednesday, October 17 and Thursday, October 18, RSA Charge 2017 attendees selected winners for being best overall in:

  • Overall Value
  • Presentation Skills
  • Credibility/Knowledge
  • Engaging/Interactive
  • Avoided Commercialization
  • Relevance

 

We would like to announce, recognize and sincerely thank the recipients of the RSA CHARGE 2017 Best in Show Award:

 

RSA Archer Suite Best in Show Award:

Deanne Dinslage, Sr. Archer Systems Administrator, Assistant Vice President, Bank of the West & Andrea Dollen, Manager, True8 Solutions            

Beyond the Customer - Making RSA Archer Suite Work for YOU! - Tired of hours of documentation for minutes of build?  Let me show you how to use RSA Archer Suite to do this in a few clicks with better results!

 

RSA Fraud & Risk Intelligence Suite Best in Show Award:

Damon Marracini, Vice President, Citi; Michael O’Connor, eCommerce Principal Product Marketing Manager, RSA; Greg Zaharchuk, Fraud Investigator, Vanguard; Qasim Zaidi, Cyber Process Manager, Capital One; Alma Zohar, Web Threat Detection Product Manager, RSA

Tales from the Trenches: Using Web Threat Detection to Fight Fraud - Learn how RSA Web Threat Detection is helping customers fight real-world cyber fraud.

 

RSA NetWitness Suite Best in Show Award:

Sean Catlett, SVP, Emerging Services, Optiv

Building a Modern Security Program:  Or… “If I Had to Start Over, What Would I Do?” – Discussion on keys to building your SOC and defending your enterprise using orchestration and automation.

 

RSA SecurID Suite Best in Show Award:

Michael Duncan, Program/Process Manager, Ameritas Life Insurance Corp; Lisa Ferraro, Developer, Ameritas Life Insurance Corp; Ravi Makam, Principal Consultant, Optiv

Insights and Lessons Learned from Upgrading RSA Identity Governance and Lifecycle and Going Virtual - Ameritas Life Insurance Corporation and Optiv discuss upgrading to RSA Identity Governance and Lifecycle version 7.0.1 and moving from a hardware appliance to VMs to take advantage of new product capabilities.

  

Congratulations to all the Best in Show Award winners – RSA Charge 2017 attendees selected these from 92 outstanding sessions! Great job and thank you!

Server Disaggregation: Sometimes the Sum of the Parts Is Greater Than the Whole


The notion of “the whole being greater than the sum of its parts” holds for many implementations of technology. Take, for example, hyper-converged infrastructure (HCI) solutions like the Dell EMC VxRail. HCI combines virtualization software and software-defined storage with industry-standard servers. It ties these components together with orchestration and infrastructure management software to deliver a combined solution that provides operational and deployment efficiencies that, for many classes of users, would not be possible if the components were delivered separately.

However, certain challenges require separating out the parts – that’s where the solution is found. And, that is true in the case of Server Disaggregation and the potential benefits such an architecture can provide.

So, what is Server Disaggregation? It’s the idea that, for data centers of a certain size, server efficiency can be improved by separating traditional servers into their component parts and grouping like components into resource pools. Once pooled, a physical server can be aggregated (i.e., built) on the fly by drawing resources from the pools, optimally sized for the application it will run. The benefits of this model are best described by examining a little history.

B.V.E. (Before the Virtualization Era)

Before virtualization became prevalent, enterprise applications were typically assigned to physical servers in a one-to-one mapping. To prevent unexpected interactions between the programs, such as one misbehaving program consuming all the bandwidth of a server component and starving the other programs, it was common to give critical enterprise applications their own dedicated server hardware.

Figure 1 describes this model. Figure 1 (a) illustrates a concept physical server with its resources separated by class type: CPU, SCM[1], GPU and FPGA, Network, Storage. Figure 1 (b) shows a hypothetical application deployed on the server and shows the portion of the resources the application consumed. Figure 1 (c) calls out the portion of the server’s resources that were underutilized by the application.

Figure 1 (c) highlights the problem with this model: overprovisioning. The underutilized resources were the result of overprovisioning the server hardware for the application to be run. Servers were overprovisioned for a variety of reasons, including lack of knowledge of the application’s resource needs, fear of possible dynamic changes in workload, and anticipated application or dataset growth over time. Overprovisioning was the result of a “better safe than sorry” mindset, which was not necessarily a bad philosophy when dealing with mission-critical enterprise applications. However, this model had its costs (e.g., higher acquisition costs, greater power consumption, etc.). Also, because the sizing of multiple servers for applications was done when the servers were acquired, a certain amount of configuration agility was lost as more knowledge about the true resource needs of the applications was gained. Before virtualization, data center server utilization could be as low as 15%.

Figure 1: Enterprise Application Deployment before Virtualization

The Virtualization Age

When virtualization first started to appear in data centers, one of its biggest value propositions was to increase server utilization. (Although many people would say, and I would agree, that equally important are the operational features that virtualization environments like VMware vSphere provide: features like live migration, snapshots and rapid deployment of applications, to name a few.) Figure 2 shows how hypervisors increased server utilization by allowing multiple enterprise applications to share the same physical server hardware. After virtualization was introduced to the data center, server utilization could climb to 50% to 70%.

Figure 2: Enterprise Application Deployment after Virtualization

Disaggregation: A Server Evolution under Development

While the improvement in utilization brought by virtualization is impressive, the amount of unused or underutilized resources trapped on each server adds up quickly. In a virtual server farm, the data center could have the equivalent of one idle server for every one to three servers deployed.
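That "one idle server for every one to three servers deployed" figure follows directly from the 50% to 70% utilization range above; a quick back-of-the-envelope check (illustrative arithmetic, not measured data):

```python
# At average utilization u, each deployed server traps (1 - u) of a
# server's worth of idle capacity. So it takes 1 / (1 - u) deployed
# servers to accumulate one full idle-server equivalent.
def servers_per_idle_equivalent(utilization):
    return 1.0 / (1.0 - utilization)

print(servers_per_idle_equivalent(0.50))            # 2.0 -> one idle per two deployed
print(round(servers_per_idle_equivalent(0.70), 1))  # 3.3 -> roughly one per three
```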

The goals of Server Disaggregation are to further improve the utilization of data center server resources and to add to operational efficiency and agility. Figure 3 illustrates the Server Disaggregation concept. In the fully disaggregated server model, resources typically found in servers are grouped together into common resource pools. The pools are connected by one or more high-speed, high-bandwidth, low latency fabrics. A software entity, called the Server Builder in this example, is responsible for managing the pooled resources and rack scale fabric.

When an administrator or a higher-level orchestration engine needs a server for a specific application, it sends a request to the Server Builder with the characteristics of the needed server (e.g., CPU, DRAM, persistent memory (SCM), network, and storage requirements). The Server Builder draws the necessary resources from the resource pools and configures the rack scale fabric to connect the resources together. The result is a disaggregated server as shown in Figure 3 (a), a full bare-metal, bootable server ready for the installation of an operating system, hypervisor and/or application.

The process can be repeated if the required unassigned resources remain in the pools, allowing new servers to be created and customized to the application to be installed. From the OS, hypervisor or application point of view, the disaggregated server is indistinguishable from a traditional server, although with several added benefits that will be described in the next section. In this sense, disaggregation is an evolution of server architecture, not a revolution, as it does not require a refactoring of the existing software ecosystem.
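A minimal sketch of that request flow, with an invented `ServerBuilder` API and made-up pool sizes (a real implementation would also program the rack scale fabric, which is omitted here):

```python
# Toy model of a Server Builder drawing disaggregated servers from
# shared resource pools. All names and quantities are hypothetical.
class ServerBuilder:
    def __init__(self, pools):
        self.pools = dict(pools)  # free capacity per resource class

    def compose(self, request):
        """Allocate a disaggregated server if the pools can satisfy it."""
        if any(self.pools.get(r, 0) < qty for r, qty in request.items()):
            return None  # insufficient free resources
        for r, qty in request.items():
            self.pools[r] -= qty
        return dict(request)  # handle describing the composed server

    def release(self, server):
        """Retire a server; its resources return to the pools."""
        for r, qty in server.items():
            self.pools[r] += qty

builder = ServerBuilder({"cpu_cores": 64, "dram_gb": 1024, "nvme_tb": 32})
server = builder.compose({"cpu_cores": 16, "dram_gb": 256, "nvme_tb": 4})
print(builder.pools)  # remaining free capacity shrinks accordingly
```

The `release` path is what enables the repurposing benefit described below: retiring a server immediately frees its resources for the next composition request.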

Figure 3: Disaggregated Servers

The Benefits of Being Apart

While having all the capabilities of a traditional server, the disaggregated server has many benefits:

  • Configuration Optimization: The Server Builder can deliver a disaggregated server specifically composed of the resources a given application requires.
  • Liberation of Unused Resources: Unused resources are no longer trapped within the traditional server chassis. These resources are now available to all disaggregated servers for capability expansion or to be used for the creation of additional servers (see Figure 3 (b)).
  • Less Need to Overprovision: Because resources can be dynamically and independently added to a disaggregated server, there will be less temptation to use a larger than needed server during initial deployment. Also, since unused resources are available to all existing and future configurations, spare capacity can be managed from a data center level instead of a per server level, enabling a smaller amount of reserved resources to provide the overflow capacity to more servers.
  • Independent Acquisition of Resources: Resources can be purchased independently and added separately to their respective pools.
  • Increased RAS (Reliability, Availability and Serviceability): High-availability can be added to server resources where it was not possible or economical to do so before. For example, the rack scale fabric can be designed to add redundant paths to resources. Also, when a CPU resource fails, the other resources can be remapped to a new CPU resource and the disaggregated server rebooted.
  • Increased Agility through Repurposing: When a disaggregated server is retired, its resources return to the pool, where they can be reused in new disaggregated servers. Also, as application loads change, disaggregated servers devoted to one application cluster can be reformed and dedicated to another application cluster with different resource requirements.

The above list is not exhaustive and many other benefits of this architecture exist.

The Challenges (and Opportunities) of a Long(ish)-Distant Relationship

Full server disaggregation is not here yet; the concept is still under development. For it to be possible, an extremely low-latency fabric is required to allow the components to be separated at the rack level. The fabric also needs to support memory semantics to be able to disaggregate SCM (Storage Class Memory). It remains to be seen if all DRAM can be disaggregated from the CPU, but I believe that large portions can be, depending on the requirements of the different classes of data used by an application. Fortunately, the industry is already developing an open standard for a fabric that is perfect for full disaggregation: Gen-Z. Information about the Gen-Z effort can be found at www.genzconsortium.org.

The software that controls resources and configures disaggregated servers, the Server Builder, needs to be developed. It also provides opportunities for the addition of monitoring and metric collection that can be used to dynamically manage resources in ways that were not possible with the traditional server model.

Another opportunity is the tying together of the disaggregated server infrastructure with the existing orchestration ecosystems. Server Disaggregation is in no way a competitor to existing orchestration architectures like virtualization. On the contrary, Server Disaggregation is enhancing the traditional server architecture that these orchestration environments already use.

One can imagine that the management utilities administrators use to control their orchestration environments could be augmented to communicate directly with the Server Builder to create the servers they need. The administrator may never need to interface with the Server Builder directly. The benefits of disaggregation should be additive to the benefits of the orchestration environments.

Conclusion: An Exciting Time in Server Architecture

It is an exciting time to be involved in server architecture. New technologies like SCM and rack scale, low-latency fabrics are opening new doors for server innovation. Server Disaggregation has the potential to be one of these important innovations. Indeed, we have already seen some of the benefits of the disaggregation of some of the server components in systems like the Dell EMC PowerEdge FX2 and Dell EMC PowerEdge VRTX. Server Disaggregation can build on the benefits these examples provide and lead to a more efficient and more dynamic server infrastructure environment.

[1] SCM – Storage Class Memory. A class of emerging persistent memory technologies with latencies lower than NAND flash.


Envisioning the Future with Titanium Black Partners


The IT industry is one of constant change, perhaps never greater than right now – from sweeping consolidation, to revolutionizing IT infrastructure. Transformation is happening all around us – at Dell EMC, we talk about the four critical transformations we believe every organization must embrace to remain competitive.

Being a viable business today means having a strong point-of-view of where the world is going tomorrow. Earlier this month, during our Titanium Black “Access to the Future” experience, we shared with these select, elite Partners Dell EMC’s vision for the future, how we’re thinking about 2030 – from our technology roadmap, to our go-to-market plans. Titanium Black Partners are innovating in incredible ways, going big and winning big with Dell EMC’s best-in-class portfolio, and architecting real and meaningful change for our end users. They understand the four transformations as essential, and are helping their customers prepare for tomorrow… today.

Hearing and engaging with Dell leadership, including Michael Dell, Jeff Clarke, Tom Sweet, Howard Elias, Rory Read, Jeremy Burton, Marius Haas, Bill Scannell and John Roese, we discussed the decisions and influences shaping how Dell EMC will remain competitive and win, and how our Partners can align and embed with us.

As we stand shoulder-to-shoulder with our Partners, we face an exciting future. One that promises continued evolution and even revolution. One that holds tremendous opportunities. Titanium Black Partners told me throughout “Access to the Future” that they are on board with us 100%. They told me they are excited about and confident in our vision and strategy. Most importantly, they told me they can see and feel how Dell EMC and our Partners will lead and continue to grow… together.

Global Channels is a powerful force in Dell EMC’s success, with our incredible Titanium Black Partners leading the way. No one at “Access to the Future” is settling for ordinary. We are charging full force towards Extraordinary.


Democratizing Artificial Intelligence, Deep Learning and Machine Learning with Dell EMC Ready Solutions


Artificial Intelligence (AI), Machine Learning (ML) and Deep Learning (DL) are at the heart of digital transformation by enabling organizations to exploit their growing wealth of big data to optimize key business and operational use cases.

• AI is the theory and development of computer systems able to perform tasks normally requiring human intelligence (e.g. visual perception, speech recognition, translation between languages, etc.).
• ML is a sub-field of AI that provides systems the ability to learn and improve by itself from experience without being explicitly programmed.
• DL is a type of ML built on a deep hierarchy of layers, with each layer solving a different piece of a complex problem. These layers are interconnected into a “neural network.” A DL framework is software that accelerates the development and deployment of these models.

See “Artificial Intelligence is not Fake Intelligence” for more details on AI | ML | DL.
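As a toy illustration of ML's "ability to learn and improve by itself from experience without being explicitly programmed," the snippet below fits a weight to example data by gradient descent instead of hand-coding the relationship. It is purely illustrative and uses none of the frameworks named later:

```python
# A linear model y ≈ w * x "learns" w from examples rather than being
# told the rule. The data roughly follows y = 2x.
data = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 8.0)]

w = 0.0    # initial guess
lr = 0.01  # learning rate
for _ in range(1000):
    # gradient of the mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad

print(round(w, 2))  # learned weight, close to 2
```

DL frameworks such as TensorFlow apply this same learn-from-data loop, at vastly larger scale, across the layered neural networks described above.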

And the business ramifications are staggering (see Figure 1)!

Figure 1: Source: McKinsey

And senior executives seem to have gotten the word. BusinessWeek (October 23, 2017) reported a dramatic increase in mentions of “artificial intelligence” during 363 third-quarter earnings calls (see Figure 2).

Figure 2: Executives Mentioning “Artificial Intelligence” During Earnings Calls

To help our clients exploit the business and operational benefits of AI | ML | DL, Dell EMC has created “Ready Bundles” that are designed to simplify the configuration, deployment and management of AI | ML | DL solutions.  Each bundle includes integrated servers, storage, networking as well as DL and ML frameworks (such as TensorFlow, Caffe, Neon, Intel BigDL, Intel Nervana Deep Learning Studio, Intel Math Kernel Library-Deep Neural Networks, and Intel Machine Learning Scaling Library) for optimized ML or deep learning.

Driving AI | ML | DL Democratization

Democratization is defined as the action/development of making something accessible to everyone, to the “common masses.”  History provides democratization lessons from the Industrial and Information Revolutions.  Both of these moments in history were driven by the standardization of parts, tools, architectures, interfaces, designs and training that allowed for the creation of common platforms.  Instead of being dependent upon a “high priesthood” of specialists to assemble your guns or cars or computer systems, organizations of all sizes were able to leverage common platforms to build their own sources of customer, business and financial differentiation.

AI | ML | DL technology stacks are complicated systems to tune and maintain, expertise is limited, and a single small change to the stack can lead to failure.  The AI | ML | DL market needs to go through a similar “standardization” process in order to create AI | ML | DL platforms that enable organizations of all sizes to build their own sources of customer, business and financial differentiation.

To help accelerate AI | ML | DL democratization, Dell EMC has created Machine Learning and Deep Learning Ready Bundles.  These pre-packaged Ready Bundles de-risk and simplify AI | ML | DL projects and accelerate time-to-value by pre-integrating the necessary hardware and software.

No longer is a siloed knowledge group of specialists required to stand up your AI | ML | DL environments.  Instead, organizations can focus their valuable data engineering and data science resources on creating new sources of customer, business and operational value.

Monetizing Machine Learning with Dell EMC Consulting

Across every industry, organizations are moving aggressively to adopt AI | ML | DL tools and frameworks to help them become more effective in leveraging data and analytics to power their key business and operational use cases (see Figure 3).

Figure 3: AI | ML | DL Use Cases Across Industries

The business opportunities are plentiful, so the real challenge isn’t identifying opportunities to exploit ML for business and operational advantage. The real challenges are:

  • Identifying where and how to start integrating AI | ML | DL into business models by envisioning, identifying, validating and prioritizing the potential use cases
  • Building an elastic data platform (data repository or data lake) that enables the organization to capture, enhance, protect and share the organization’s key data and analytics digital assets.

Dell EMC Services exist to help customers bridge the gap across the data science teams, IT teams, and lines of business. Working together allows us to take the journey with you from deployment to use case development to full production.

Below are two examples of where Dell EMC has helped clients to integrate AI | ML | DL into their key business and operational processes.

Use Case #1:  Bladder cancer identification using medical image recognition

Image recognition of the human body is expected to improve drastically, helping doctors deliver better and more accurate medical diagnoses. ML applied to image recognition of organs, even in the presence of disease, can minimize the possibility of medical errors and speed up disease diagnosis. This matters because a delay in diagnosis means a delay in treatment. Given the promise of these methods, medical imaging technologies will play a key role in diagnostics and therapeutics in the very near future.

For this engagement, we used Magnetic Resonance Imaging (MRI) scans from the Cancer Imaging Archive to identify bladder cancer in patients using unsupervised and supervised ML techniques.  The algorithms identified significant differences between the images and enabled physicians to see which features can be relevant for bladder cancer detection.  ML techniques can also reduce the noise in the images to deliver better outcomes (see Figure 4).

Figure 4: Leveraging Machine Learning to accelerate Bladder Cancer Detection
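To make the unsupervised step less abstract, the sketch below clusters synthetic “pixel intensities” with a minimal 1-D k-means, the kind of unlabeled-data technique described above. This is not the engagement’s actual pipeline, and the data is invented; real MRI work operates on full 2-D/3-D images with far richer features.

```python
import random

def kmeans_1d(values, k=2, iters=20, seed=0):
    """Minimal 1-D k-means: group intensity values into k clusters.
    Unsupervised -- no labels are needed to surface structure."""
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            # Assign each value to its nearest cluster center.
            idx = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[idx].append(v)
        # Move each center to the mean of its assigned values.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Synthetic "pixel intensities": dark background vs. a brighter region.
pixels = [10, 12, 11, 9, 200, 205, 198, 202, 13, 199]
print(kmeans_1d(pixels, k=2))
```

The two recovered centers separate the dark and bright populations; in imaging work the same idea helps segment regions of interest before a supervised model scores them.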

As the precision of the ML algorithms improves, so will the accuracy of the results delivered. The benefits will continue to grow across the globe as more image data becomes available, more ML models are trained, and the effectiveness of those models is continuously refined.

It is reasonable to say that computer-aided tumor diagnosis using AI | ML | DL techniques will deliver important benefits to society. It will permit a reduction in healthcare costs and time-to-treatment while driving more effective outcomes (see Figure 5).

Figure 5: Leveraging AI | ML | DL to Augment Human Decision-making

Note:  This use case recently won the 2017 award as “Best of Applied Data Analytics” from Dell EMC’s Proven Professional Knowledge Sharing program.

Use Case #2:  Crop disease identification

Human society needs to increase food production by an estimated 70% by 2050 to feed a population predicted to exceed 9 billion people.  Currently, infectious diseases reduce potential crop yields by an average of 40%, with many farmers in the developing world experiencing yield losses as high as 100%[1].

The situation is particularly dire for the 500 million smallholder farmers around the globe, whose livelihoods depend on their crops doing well. In Africa alone, 80% of the agricultural output comes from smallholder farmers.

The widespread distribution of smartphones among crop growers around the world, with an expected 5 billion smartphones by 2020, offers the potential of turning the smartphone into a valuable tool for diverse communities growing food.  One potential application is the development of mobile disease diagnostics through Deep Learning and crowdsourcing.

The engagement delivered very impressive results in scoring different types of crops and their disease risk (see Figure 6).

[1] “An open access repository of images on plant health to enable the development of mobile disease diagnostics” https://arxiv.org/abs/1511.08060

Figure 6: Crop Disease Identification and Scoring

The Dell EMC Ready Solutions + Dell EMC Consulting = Intelligent Enterprise

Dell EMC Consulting provides a full portfolio of Services designed to help our clients to accelerate AI | ML | DL adoption and monetize their data assets (see Figure 7).

Figure 7: Dell EMC Consulting Tying Together the AI | ML | DL Use Cases

The end goal for any organization is to master the use of AI | ML | DL to derive and drive customer, product, and operational value across the entire organization; to create the “Intelligent Enterprise” that has the ability to continuously learn and adapt to changing business, environmental, competitive and economic conditions (see Figure 8).

Figure 8: Creating “Intelligent Enterprises”

The future is now, and Dell EMC has joined the AI | ML | DL Ready Bundles with Dell EMC Consulting to accelerate the customer journey to the “Intelligent Enterprise.” For additional information, please visit dellemc.com/services.

The post Democratizing Artificial Intelligence, Deep Learning and Machine Learning with Dell EMC Ready Solutions appeared first on Dell EMC Big Data.


New Online Support Search: Pilot for select countries

At Dell EMC, we are committed to finding ways to continuously improve our customers’ service and support experience. A key element in that process is working in conjunction with our customers to identify and prioritize key process and functionality

Five Ways to Engage With the Hyper-Connected Customer


Technology is ubiquitous, and how thoroughly it has taken over our lives is astounding. Today, the better question is not whether someone is connected, but how many devices they connect to the Internet each day, and for how long.


In 2014, the number of mobile devices connected to the Internet surpassed the world’s entire population, and the average consumer attention span has dropped to just 8 seconds. This means it is more important than ever for companies and marketing teams to reach out to this hyper-connected customer in new, innovative ways that match their expectations. Let’s discuss a few ways in which we can positively impact the connection.

User Experience as a Competitive Differentiator

If a customer is always connected, they will expect the same from a company. Unreliable access to a service or a product online is not an option. In this context, “unreliable” also includes overall user experience. Organizations know this – Customer Experience Insight says that 95% of all CEOs indicate user experience is a potential competitive differentiator. But then why do only 37% have a budget dedicated to it?

Whether the cause is penny-pinching with the marketing budget or uncertainty about whether user experience efforts belong on the IT budget, one thing is certain: user experience optimizations have a positive impact. Better usability makes the connected customer’s life easier, information more easily found and services more frequently used.

Technology innovations embraced by a wide audience have always relied on a selling point that they make consumers’ lives better. The same logic should be true for the technology that businesses use to reach out to customers and for ongoing communication.

Level Up: Gamification

One way companies innovate in this field is with gamification. Essentially, this isn’t very different from coupons and classic customer loyalty programs, but it adds a new element to the mix. For an increasing subset of customers, games and interactivity are a given, so it is no surprise that adding this element to the user experience will pull in more people in that segment.

While there is no surefire way to guarantee more customer interaction through user experience or gamification, best practices from game design can help companies. Interactivity is game designers’ core business, after all.

  • Use your audience’s prior experiences to your advantage: for instance, people already understand that the ‘thumbs up’ icon means liking, supporting or improving something.
  • Create a sense of ownership: mostly, this is through personalization. It makes people feel they have a stake in how they interact with you.
  • Dare to be different: standing out with a unique design that many people like and some will dislike is better than arousing no feelings at all.
  • Design for the right audiences: while an API section on a website should speak to developers, a notification system for users on the go has to be brief and to the point.

You Don’t Need to Change Much to Change a Lot

The most crucial point is that you don’t need to change much to change a lot. Innovation isn’t (always) rocket science. Game changers often start out small. Uber started from the idea that all it should take is one tap or click to call up a taxi. As a consequence, it eliminated centralized management structures and shortened the channel from customer to driver. Small changes can snowball into big disruptions.

It isn’t all about cutting and trimming, however. Some airline companies fight the low-cost competition by offering an end-to-end travel experience, with personal care, taxi and hotel reservations included in the offer.

Content Is Always King

Inbound marketing, or content marketing, is another way to maximize the connection to your customer by offering SEO-friendly content. No matter how well-honed a contact strategy, product positioning or a company’s use of channels is, content is what draws people in. As Michael Volkmann of iDea Group says: “For Google, content is king. For social media, too, content is king. For consumers, content is king.”

The hyper-connected customer navigates more channels than ever, so content should be optimized for each channel. More and more channels are popping up, and companies are experimenting with content on Snapchat or paying social media influencers to fulfill the role of market developers.

In Summary

Attracting the hyper-connected customer can be done in many ways, but here are the five things from the article that stand out:

  1. User experience makes a difference
  2. Gamification can be an asset to draw in people
  3. Game design offers many valuable lessons for marketing
  4. Innovation often starts out small
  5. Content is still the undisputed king

We are living in the 4th industrial revolution, and our customers’ expectations have changed. Share with us how you have driven innovation in your organizations to better delight customers.

 


A Business Value Approach to IT Investing


Business today requires the effective use of technology to create competitive differentiation, drive growth, and optimize profit.   Yet, while business is dependent on technology, increasing IT expenditures can be viewed as a drag on business results. That’s why evaluating technology investments in terms of business outcomes is so important.

We’re familiar with the stories of how technology has changed not only businesses but entire markets.  The first-to-market coffee chain offering free wifi has a competitive advantage. The on-line book seller that leveraged its infrastructure to become an IT service provider realized phenomenal business growth. The brick and mortar retailer that integrated its supply chain and its distribution centers was able to optimize profits.

The organizations behind these success stories evaluated investments not only in terms of cost but also in terms of the value the investment could bring to the business—the business outcome.

In “The Best Path Forward”, an article by Russ Banham about how CFOs make capital budgeting decisions, the author acknowledges how difficult these decisions are, saying “deciding whether an investment is worth funding is not a job for the fainthearted.”  Knowing that CFOs usually are directly involved in most technology decisions or define the criteria by which they’re evaluated, it’s helpful to know how CFOs think about and evaluate investments.  Banham quotes Mark Partin, CFO of the accounting software company BlackLine, as seeing the CFO’s role as “stitching together [the company’s] strategic growth plan and fundamental investment model, year after year.”  Banham goes on to state that David Hensley, CFO at Power Distribution, discovered “the techniques of capital budgeting can be biased toward certain kinds of projects and rarely give CFOs all the answers…it is often the riskier, hardest-to-measure investments that can be most transformative for a company.”

As we’ve collaborated with our customers to help them not only to evaluate the potential value of a technology investment decision but to look backward at the total IT, business and financial impact of that decision, we’ve learned that it’s essential to go beyond the “classic” criteria used to evaluate technology decisions, especially those in the data center.

In the classic business model, IT was a cost center and the key criterion when choosing between investment alternatives was to select the option deemed to have the lowest Total Cost of Ownership (TCO).  But TCO allows us to measure only a small portion of the value of any potential investment. TCO not only fails to recognize the transformative opportunities of technology but also keeps IT relegated to a cost corner rather than positioning IT leadership as equal partners in the business.

At Dell EMC we collaborate with our customers, applying a business value approach that encompasses strategic goals as well as financial and non-financial criteria that go beyond TCO to demonstrate business and IT value, and to position IT as a champion of better business outcomes.

These value principles underpin our unique Customer Value Program enabling us to help customers assess, or forecast, the value of a converged/hyper-converged solution, implement operational and organizational changes to transform and unlock more value from their investment, and use a proactive, continuous-improvement approach to realize, drive and measure the most value from their investment.

The Customer Value Program leverages the successful transformation projects of many customers who have navigated technology, organizational, resource skill, and process changes to enable better planning decisions, mitigate risk and maximize the probability of desired business outcomes. From detailed guidance, assessment tools and expert advice, to educational and certification programs, to quantifying business, IT and financial outcomes, the Customer Value Program provides a comprehensive process to chart a Transformation journey.  Start yours today by going to http://www.dellemc.com/converged-infrastructure/customer-value/index.htm

 


Give Your Customers PC Peace of Mind


There’s no denying that the workplace IT environment is more complex than ever before. To make life a whole lot easier for you and your customers, Dell offers a full range of devices that are easy to deploy, manage and maintain throughout their entire lifecycle. The lifecycle of these devices provides you with a number of opportunities to improve and secure your customers’ end-user experiences while consolidating their bottom line … and increasing yours.

The newest generation of tech-savvy workers has well and truly arrived, with each worker now using an average of 3.5 devices to get their work done. So how do you help your customers keep up – especially when their IT budgets aren’t necessarily rising with the tide?

Dependability That Keeps up With Demand

We understand that your biggest priority is maintaining a positive and effective relationship with your customers: encouraging their growth by helping them create and cultivate their personal Digital Transformation.

That’s why at Dell, our focus is on handling the detailed enquiries your customers might have via our industry-leading, co-deployable co-delivery service. Not only will you get devices that can arrive from the factory pre-imaged, along with free Dell tools that make device management much easier, but you’ll also gain peace of mind knowing that we have the answers to any and all of your customers’ requirements – no matter what stage of the lifecycle they’re in.

Redefining the PC Lifecycle

Ultimately, your customers need a fresh approach: cost-effective solutions that keep pace with user demand while freeing up time, money and resources to reinvest in future innovation. That’s why we’re redefining the concept of PC lifecycle management. By moving from separate linear solutions to a fully integrated 4-phase cyclical process, we can help optimize your customers’ PC performance at every level. By eliminating redundancies, significantly improving security and streamlining the entire process end-to-end, your customers can save up to 25 percent on PC management as a whole.

The cycle is simple and consists of four ongoing phases that span from planning to refresh and back again:

  • Plan & Design
  • Deploy & Integrate
  • Manage & Support
  • Optimize or Retire

Plan & Design

Dell provides you with experienced specialists to help you get everything right from the very start. Starting with a thorough assessment of your customer’s specific needs, a well-defined plan is created to optimize the existing IT environment, while managing complex transformations to newer technologies. Through this approach, Dell can help you migrate applications, optimize data and operating systems, implement cloud solutions and roll out critical infrastructure.

Deploy & Integrate

We’ll give you a clear focus on driving efficient modifications implemented as a frictionless user experience. Dell offers a deployment model that can scale to the needs of your customers, regardless of size:

  • Dell ProDeploy Client Suite: a fixed price, off-the-shelf solution that delivers systems pre-imaged and configured.

To ensure that your customers’ end users can exploit all the benefits of this deployment, you can also offer them Dell Education Services, which provides full-service training for both regular users and IT professionals.

Manage & Support

Once deployment and implementation have been established, you’ll want to maximize efficiency while minimizing downtime.  Dell ProSupport provides fast, proactive IT support for businesses of all sizes. Your customers get a team of in-region engineers who are available 24×7 and respond proactively to provide them with a single source for hardware and software issues, maximizing their user uptime. In addition to the right support, keeping devices secure is critical for everyone, and services like Dell Endpoint Security Suite Enterprise are specially designed to protect your customers’ most vulnerable processes.

Optimize or Retire

The final phase of the lifecycle is where you can help your customers make informed decisions on which devices should be retired and which should be re-purposed or refreshed. Dell has the experience to help you calculate the optimum time to implement new equipment before it becomes a hindrance to both productivity and security. When devices need to be disposed of, worry-free services are available that include analysis of resell opportunities and secure data wipes – all compliant with the latest protocols and regulations.

We want to help you make the most of your customers’ PC lifespan. We’ll work behind the scenes with you to ensure you deliver flawless planning right through to re-purposing of assets.

Use the co-brandable PC Lifecycle content to take the message to your customers by downloading it on Campaign Builder here.

If you don’t have access to Campaign Builder, please email the following regional teams:

We also have these assets available on the Digital Marketing Platform, so that you can deploy the email and social content directly to your customers from the tool.

If you don’t have access to the Digital Marketing Platform, please register here.

 

 

 


Value-Based Care and Industry Consolidation Driving Demand for Vendor Neutral Archives


After investing in health information technology for several years, the healthcare industry has found itself mired in digital data today, and in the years to come.

Indeed, in 2016, IDC, in collaboration with Dell EMC, projected that healthcare stakeholders will produce 2,314 exabytes of data by 2020, a significant increase over the 153 exabytes generated in 2013.

This data growth comes during a time of major transformation around both the delivery of healthcare services, and the way that providers are reimbursed. In the value-based care environment, where payment is tied to clinical efficiency and patient outcomes, healthcare data fragmentation is problematic. Clinicians need access to more data sources and analytics to generate insights and determine the most efficacious treatment for their patients.

The challenge of this evolving industry is that today’s health IT infrastructures were not architected and deployed in a way that streamlines data sharing even within a single institution.

Until recently, healthcare organizations deployed diagnostics tools to meet the needs of individual departments. These isolated projects included localized storage infrastructure, leading to the creation of a new data silo with each additional deployment.  This approach subsequently complicated the task of compiling a complete digital picture of a patient’s health from disparate information sources.

The continued rise in hospital mergers and acquisitions adds further complexity, as healthcare IT systems undergo consolidation. Pressure to better manage costs and significantly improve the patient experience has led providers towards consolidation, but it has not always been easy for merging organizations to synthesize their data along with their administrative operations.

Siloed Infrastructure Unable to Provide a 360-Degree Patient View to Clinicians

Traditional IT infrastructure – and in particular, the storage architectures supporting existing and new modalities – represents a significant roadblock for providers seeking an integrated workflow across departments.

Legacy workflows, infrastructures, and storage architectures are not designed to support a 360-degree view of the patient, nor can they handle the accelerated growth of medical imaging data that will eventually feed machine learning and artificial intelligence models geared towards providing clinical decision support.

Historically, if an organization had three PACS, a physician wanting to look at a patient’s images across all systems would have to open three different viewers, log in three different times and search for the patient three different ways. The physician would then need to review the images manually and mentally assemble them into a complete picture.
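The consolidation a VNA enables can be pictured as a single federated query that walks every archive, rather than three separate logins and searches. The sketch below is purely illustrative: `PacsArchive`, `federated_find`, and the sample data are hypothetical stand-ins, and a real integration would issue DICOM C-FIND queries to each system instead.

```python
# Minimal sketch of a federated patient lookup across several imaging
# archives. PacsArchive and its contents are hypothetical examples;
# a production system would query real PACS endpoints via DICOM.

class PacsArchive:
    def __init__(self, name, studies):
        self.name = name
        # map of patient ID -> list of study descriptions
        self.studies = studies

    def find_studies(self, patient_id):
        return self.studies.get(patient_id, [])

def federated_find(archives, patient_id):
    """Query every archive once and merge the results,
    tagging each study with the archive it came from."""
    results = []
    for archive in archives:
        for study in archive.find_studies(patient_id):
            results.append((archive.name, study))
    return results

archives = [
    PacsArchive("radiology", {"P001": ["CT chest"]}),
    PacsArchive("cardiology", {"P001": ["Echocardiogram"]}),
    PacsArchive("oncology", {}),
]

# One call surfaces every study held for the patient, regardless of
# which departmental system stores it.
combined = federated_find(archives, "P001")
print(combined)
```

The point of the pattern is that the clinician-facing layer performs one search; the fan-out across departmental silos happens behind a single interface.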

VNAs Ensure Reliable Access to the Right Data at the Right Time

Fortunately, a solution to healthcare workflow integration for medical imaging does exist in the form of the vendor-neutral archive (VNA). A storage infrastructure that does not require a redesign every time an organization adds new data sources or makes workflow adjustments can significantly improve efficiency and IT agility, offering enhanced insights and more reliable access to the right data at the right time.

Migrating medical imaging files to new storage systems during an architecture upgrade, for example, can be a complicated project. Most organizations undertake this type of periodic refresh every three to five years to prevent hardware failures and upgrade infrastructure capabilities. As organizations generate and store more medical imaging data, the project gets more complex and costly each time.

A VNA can prevent data gaps by managing all updates to DICOM files and pointers, drastically reducing the burdens and costs of this critical process.
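One way to picture the pointer management described above: the archive keeps an index from a stable study identifier to the file’s current storage location, so a storage refresh rewrites only the pointer while the identifier clinicians reference never changes. The `VnaIndex` class and the paths below are hypothetical, a sketch of the idea rather than any vendor’s implementation.

```python
# Illustrative sketch of VNA-style pointer management: studies are
# addressed by a stable identifier, while an index tracks where the
# underlying file currently lives. All names here are hypothetical.

class VnaIndex:
    def __init__(self):
        self._locations = {}  # study UID -> current storage path

    def register(self, study_uid, path):
        self._locations[study_uid] = path

    def resolve(self, study_uid):
        """Return the current location of a study."""
        return self._locations[study_uid]

    def migrate(self, study_uid, new_path):
        # During a hardware refresh, only the pointer changes;
        # callers keep using the same study UID, so no data gap
        # or broken reference is introduced.
        self._locations[study_uid] = new_path

index = VnaIndex()
index.register("1.2.840.1", "/tier1/old_array/study.dcm")

# Refresh cycle: move the file, then update the pointer.
index.migrate("1.2.840.1", "/tier2/new_array/study.dcm")
print(index.resolve("1.2.840.1"))
```

Because viewers and downstream systems resolve studies through the index rather than through hard-coded paths, a migration becomes a metadata update instead of a change rippling through every integrated application.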

A VNA also allows a healthcare organization to integrate viewing capabilities and storage with other health IT solutions regardless of its specific PACS application vendor, and its automated data reconciliation capabilities will result in less time spent on ensuring that healthcare providers are able to retrieve the data they need to make informed decisions.

Ultimately, provider organizations should seek to create future-proof infrastructure that is flexible enough to support a broad range of anticipated performance demands, including advanced data analytics, expansion into private, hybrid, or public clouds, and constantly changing clinical workflows.

The VNA is a foundational component of a healthcare ecosystem predicated on efficiency and quality. The challenge remains making preparations for a VNA deployment and choosing the right strategy for the successful launch of a new system.

To achieve these goals, organizations may wish to partner with infrastructure vendors who can help them scale their architecture without downtime, consolidate without degrading day-to-day performance, and reduce or eliminate the burden of future migrations.

Value-based care and provider consolidation are driving healthcare organizations to reevaluate the status of their current resources, especially health information. While some health systems and hospitals have the financial capital for a VNA deployment, others may have to consider a phased approach. In either case, a business imperative is driving medical imaging integration with other health IT systems to ensure that physicians are making care decisions based on the most pertinent, complete, and timely patient data.

For more information on Dell EMC’s Vendor Neutral Archiving solution, download our white paper here.

