Hadoop is Just the Beginning: Realizing value from big data requires organizational change – and it’s hard.

Back in the 1990s, ‘decision science’ was all the rage. Really the harbinger of big data, decision science focused on streamlining decision-making and using all available tools and data for advanced modeling. Consolidating and combining disparate, independent functions became a key enabler of decision science. For example, marketing of financial services offerings, when done independently of a Risk Management function, focused mostly on increasing revenue from new accounts. Risk Management, however, also needed to ensure that those new accounts would not ultimately become bad assets. Combining elements of both functions allowed for a more efficient, coordinated process with better outcomes.

Decision science, in the example above, was most effective when not just the analytic models were combined but the organizations as well; the most effective companies employing decision science created new organizations, roles, and titles, along with processes focused on coordination and control.

Twenty-five years later, decision science has been replaced by ‘data science.’ It is essentially the same concept, deploying better solutions through advanced data access and modeling, except that the data is now at massive scale. Companies are deploying new technologies at a record pace, but many of those same companies are neglecting to update their organizations as they would have with decision science, because it can be very hard to do: it’s one thing to bring on new technologies, but updating organizations, moving resources around, changing reporting relationships… that’s hard! The result, however, doesn’t just inhibit change, it actually prohibits it. The real value from big data is not accruing as it should.

To be ultimately effective, big data technology relies on five core enablers:

  1. Use case generation, prioritization and approval – links business value to analytics initiatives
  2. Data lineage and metadata – ties use case implementation to core data assets
  3. Governance, security & access control – ensures regulatory compliance
  4. Success tracking & learning incorporation – enables learning, evolution, and business transformation
  5. Operating model & budget allocation – tackles the often thorny question: who gets control

Simply implementing policies or technologies to install these functions is not enough; the organization must be set up to embody and embrace these core principles. That doesn’t happen without making some hard decisions:

  • If the CxO pays for the infrastructure, does the CxO decide what use cases get deployed and in what order? If not, who does and why?
  • How do analytic models get deployed into production? What controls exist to review and refine the results?
  • On what source systems and data are use cases deployed? Is source system sprawl simply getting covered over or are real data issues being fixed in the process?
  • Who will data lake administrators report to? Data scientists? Does every organization have their own, or are they centralized?

Without tackling and answering these fundamental questions alongside the deployment of new technology, that technology will not produce anything close to lasting, transformative change.

So how do leading organizations tackle this? How do they make sure that they have a transformed organization to support a transformed big data capability?

To put these functions into effect, consider two governance bodies: an Analytics Governance Council (AGC) and a Data Governance Council (DGC). Both are made up of existing stakeholders and are designed to govern a data lake and analytics capability seamlessly and collaboratively. Here’s how they look and what they do:

Analytics Governance Council (typically up to 12 members)

  • Agrees on use case prioritization methodology
  • Audits potential use cases to pursue for the best combination of implementation feasibility and business benefit (see the scoring sketch after this list)
  • Recommends BI tool disposition
  • Oversees and approves DGC
  • Sets data lake / DaaS permission levels for Data Scientists
  • Works to secure approval for use case implementation
  • Meets monthly
  • Made up of senior executives with budgetary authority, representing business units or functions across the business

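As an illustration of how the AGC’s prioritization methodology might be encoded, here is a minimal sketch in Python. It assumes each candidate use case is scored 1–5 on implementation feasibility and business benefit and ranked by a weighted combination; the weights, scales, and use case names are hypothetical, not taken from any specific engagement.

    # Minimal sketch: rank candidate use cases by a weighted combination of
    # implementation feasibility and business benefit (both scored 1-5).
    # Weights, scales, and use case names are hypothetical.

    FEASIBILITY_WEIGHT = 0.4
    BENEFIT_WEIGHT = 0.6

    candidate_use_cases = [
        # (name, feasibility, business benefit)
        ("Customer churn model", 4, 5),
        ("Predictive maintenance", 3, 4),
        ("Marketing attribution", 5, 2),
    ]

    def priority_score(feasibility, benefit):
        """Weighted score used to order the AGC's backlog."""
        return FEASIBILITY_WEIGHT * feasibility + BENEFIT_WEIGHT * benefit

    # Rank candidates from highest to lowest score for AGC review.
    for name, feasibility, benefit in sorted(
        candidate_use_cases,
        key=lambda uc: priority_score(uc[1], uc[2]),
        reverse=True,
    ):
        print(f"{name}: {priority_score(feasibility, benefit):.2f}")

In practice the weights themselves are one of the AGC’s decisions; the point is that the methodology is explicit and repeatable rather than decided ad hoc for each use case.
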
Data Governance Council (no more than 10-12 members)

  • Operates at a working level – decides on the specific data components underlying each use case
  • Defines and approves metadata labels
  • Organizes and rationalizes data sources
  • Advises on access to data and potential compliance issues
  • Sets standard data sources, tables, and elements to be used for each use case / calculation (see the sketch after this list)
  • Meets weekly (initially)
  • Made up of data stewards or SMEs with direct knowledge of core data sources on which use cases are built

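To make the DGC’s outputs concrete, here is a minimal sketch of the kind of record it might approve for a single use case: the standard sources, tables, elements, and metadata labels that every implementation of that use case must draw on. All names, labels, and identifiers here are hypothetical, and in practice such a record would likely live in a metadata catalog rather than in code.

    # Minimal sketch of a DGC-approved record tying one use case to its
    # standard data sources, tables, elements, and metadata labels.
    # All names, labels, and identifiers are hypothetical.

    customer_churn_standard = {
        "use_case": "Customer churn model",
        "approved_sources": {
            "crm": {
                "tables": ["crm.accounts", "crm.interactions"],
                "elements": ["account_id", "tenure_months", "last_contact_date"],
            },
            "billing": {
                "tables": ["billing.invoices"],
                "elements": ["account_id", "monthly_charges", "late_payments"],
            },
        },
        "metadata_labels": ["PII", "customer-facing", "monthly-refresh"],
        "access_level": "data-scientist-read-only",
        "compliance_note": "PII fields must be masked outside the data lake.",
    }

Having one agreed record per use case / calculation is what keeps two teams from quietly building the same metric off two different source tables.
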
[Image: recommended data analytics]

There are real nuances in how these bodies are constructed and in what they do. If done right, however, these constructs allow for real decisions to be made with shared accountability for the results.

They enable the organizational change that must underpin the rapid advances in big data and data science. Without them, the highly touted and much-promised step change in productivity, revenue growth, and customer experience from big data and advanced analytics cannot accrue.

For more information on how we’ve helped our customers navigate these changes and develop big data and analytics into a formal business practice, please visit http://www.dellemc.com/bigdataservices
