ServicesANGLE (a SiliconANGLE Network property) – http://servicesangle.com

The Security Toll Booth
https://infocus.emc.com/mat_allen/security-toll-booth/
Wed, 01 Oct 2014

When it comes to security, it’s easy for most folks to take the sanctimonious high ground – we can’t turn away from reviewing other people’s failures and shortcomings because… well… it’s better them than me.  Let’s be honest, we all know there’s a version of wacky associated with IT security that’s hard to define and equally hard to not watch.  It’s not about who’s going to win, it’s about who will lose and how badly.

The security version of the mismatch has always been murky because there’s rarely any frame of reference for what a good and fair fight looks like.  We’ve become conditioned to the carnage as “just the way it is” and accepted, if not embraced, our ignorance that any break-in – no matter how complex – is preventable.

From the corporate landscape, the one-sided fight begins and ends with IT security budget funding that ALWAYS falls short of minimum requirements.  We do not necessarily consider expansion of budget dollars to include industry exposures, infrastructure hardening methods, peer group experiences or anything other than one-dimensional threat management.  And of course, we never get to the advanced application of predictive security or a reference to a comprehensive set of best practices.  In other words, the opponent is past-his-prime and slow.

Most CTOs are cornered into the unenviable position of “betting” on the security odds.  They’re forced to choose which threats to prepare for and hope that none of the items that haven’t been budgeted for materialize.  No matter what the bet, the CTO is going to lose.  It’s not a matter of if but when, and the outcome is always the same – an “Oops… ah, we need to talk” moment.  Some estimates have North Americans spending about $40 billion a year on security by 2018 (and that number basically doubles when you add the rest of the world).[1]  By any measure, the money we’re burning on security is a source of inefficiency that only stands to grow.

In my eyes, there’s very little logic, less reason and zero common sense being exercised in the security industry right now.  We’ve all checked out and just concede that hacks, security breaches and data losses are all a normal part of doing business.  They’re a tax… a toll we must pay.  Worse yet, it doesn’t have to be this way.

[Image: source: 20th Century Fox]

By now, you’ve guessed that my primary beef is the expense associated with the way we are dealing with IT security. I’m also concerned that the institutions we all rely on to assure a civilized society are adhering to a patchwork of confusing security half-measures that would make Franz Kafka proud and Butch Cassidy & the Sundance Kid blush.  For me, sarcastic to the end, I’m finding less and less humor in the whole “security budget” punch line.  It’s predictable.  It’s boring.  Most alarming, we know this is wrong and we know it’s going to cost us, we just don’t know how much and exactly when.

So how can we balance the mismatch?  The problem is both knowledge and reference.  The business leadership that approves IT security budgets has a limited reference when it comes to a detailed understanding of how extensive security related damages can be.  In many cases, the very leadership that’s tasked with allocating capital to the management of IT security is the same group of people who can’t name the top three web browsers (you’re all laughing right now because you have your own horror story that validates this). Training can clearly solve part of the problem.  Look at it this way – OSHA estimates that safety training reduces workplace injuries by about 35% (resulting in around $20 billion a year in savings).  We’re not asking executives to work a lathe or the green chain in a saw mill.  We’re asking them to demonstrate technical understanding of one of the single biggest sources of risk to their respective organizations.

Business people tend to look for sustainability and references they can depend on.  Reference, or a common reference, can be found in a standard.  There’s simply no uniform standard that anyone is pointing to right now.  Yes, we do have ISO 27000, but that’s about as much a standard as “you’ll shoot your eye out kid” is good safety advice.

[Image: source: United Artists]

So, in the near-term, I need you to get up, open the window, and yell “I’M AS MAD AS HELL, AND I’M NOT GOING TO TAKE THIS ANYMORE!”  Or… better yet, start asking anyone who will listen for a security standard; start asking the consultants and “experts” to establish a meaningful security protocol.  I would’ve said this to begin with, but it’s just not as funny and I wouldn’t have been able to lift a great movie quote to my advantage.


[1] Contu, Ruggero; Canales, Christian; Pingree, Lawrence. “Forecast: Information Security, Worldwide, 2012–2018, 2Q14 Update.” G00264279. Gartner, 5 Aug. 2014.

Leverage “By Analysis” To Expand Your Data Science Perspectives
https://infocus.emc.com/william_schmarzo/leverage-analysis-expand-data-science-perspectives/
Tue, 30 Sep 2014

As many of you know from my previous blog, I am co-teaching an MBA course at the University of San Francisco with Professor Mouwafac Sidaoui titled “Turning Big Data into Business Power.”  In our second week of class, we conducted an exercise to brainstorm and uncover new metrics that could be improved predictors of business or operational performance.

To set up the exercise, we discussed the data science lesson of “Moneyball: The Art of Winning an Unfair Game,” which was how to leverage new sources of data and advanced analytics to uncover metrics that may be better predictors of baseball players’ performance (see Figure 1).


Figure 1: Identifying Metrics That May Be Better Predictors of Performance

Building on the Moneyball example and continuing with the sports theme, I asked the students to pretend that they were the coach for an NBA basketball team (the Golden State Warriors) that had to play the new Cleveland Cavaliers and their superstar, LeBron James.  Their job was to craft a defensive and game strategy that minimized the offensive impact that LeBron James would have in the game in order to maximize their chances (probability) of winning.

We started the exploration process by analyzing the shooting chart below, which shows LeBron James’ shooting percentages from different spots on the court (see Figure 2).


Figure 2: LeBron James’s Shooting Effectiveness

Source: http://grantland.com/the-triangle/courtvision-how-the-heat-and-spurs-score/

While this chart is interesting in highlighting areas in general where LeBron’s shooting percentages are better or worse, to be actionable we needed to get the data and analytics at the next level of detail.
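
Before introducing the technique, it may help to see what “the next level of detail” looks like mechanically.  Below is a minimal sketch in Python with pandas, on a made-up shot log (the zone labels and column names are invented for illustration, not taken from the Grantland data), rolling raw shot attempts up into the zone-level percentages a chart like Figure 2 displays:

```python
import pandas as pd

# Hypothetical shot log; zone labels and columns are illustrative only.
shots = pd.DataFrame({
    "zone": ["restricted_area", "left_corner_3", "restricted_area",
             "top_of_key_3", "left_corner_3", "mid_range_right"],
    "made": [1, 0, 1, 1, 1, 0],
})

# Attempts and shooting percentage by court zone -- the aggregate view
# that a shot chart like Figure 2 visualizes.
by_zone = shots.groupby("zone")["made"].agg(attempts="count", pct="mean")
print(by_zone.sort_values("pct", ascending=False))
```

The actionable questions start once you add more columns to that shot log, which is exactly where the “By Analysis” comes in.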

Introducing the “By Analysis”

The By Analysis is a technique used in Business Intelligence for arranging, analyzing, and reporting on data. The “By” analysis leverages a business stakeholder’s natural exploration, question and query process to uncover:

  • Metrics, measures and key performance indicators
  • Dimensions (e.g., strategic nouns) and dimensional attributes and characteristics
  • Areas for potential analytics exploration

The By Analysis uses a simple “I want to see [X] by [Y]” format to fuel the business stakeholder brainstorming process and uncover their data and analytic requirements.  For example (see the sketch after these examples):

  • “I want to see sales and product margin by product category, store, store remodel date, day of week, store demographics, and customer demographics”
  • “Show me social activity by customer, social media channel, subject area, sentiment, number of likes/favorites, number of shares/retweets, number of engaged personnel, time of day, and day of week”
  • “I want to track hospital admissions by disease class, zip code, patient demographics, hospital size, payer, area demographics and day of week”
  • “I want to compare current versus previous maintenance issues by turbine, turbine manufacturer, maintenance person, date last serviced and weather conditions”
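
Mechanically, every one of these “by” statements maps onto the same group-and-aggregate pattern.  Here is a minimal sketch in Python with pandas, on a hypothetical retail fact table (the by_analysis helper, the table, and the column names are mine, invented for illustration):

```python
import pandas as pd

def by_analysis(df, metrics, dims, aggfunc="sum"):
    """'I want to see [metrics] by [dims]' as a group-and-aggregate."""
    return df.groupby(dims)[metrics].agg(aggfunc)

# Hypothetical retail fact table.
sales = pd.DataFrame({
    "product_category": ["grocery", "grocery", "apparel", "apparel"],
    "store":            ["S1", "S2", "S1", "S2"],
    "day_of_week":      ["Mon", "Mon", "Tue", "Tue"],
    "sales":            [120.0, 80.0, 200.0, 150.0],
    "margin":           [30.0, 18.0, 90.0, 60.0],
})

# "I want to see sales and product margin by product category and day of week"
print(by_analysis(sales, ["sales", "margin"],
                  ["product_category", "day_of_week"]))
```

The value of the exercise is not the code, which is trivial, but the brainstorming that produces the list of candidate [Y]s.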

By Analysis Exercise

So the question that I posed to the class was the following:  “At what levels of detail would you want to understand LeBron’s shooting percentages or performance in order to increase the probability of your defense containing LeBron?”  The list below (as well as the image in Figure 3) shows some of the variables that the students came up with:

  • Opposing team
  • Defender
  • Game location
  • Home or Away game
  • Game location’s elevation
  • Game time weather
  • Game time temperature
  • Game time humidity
  • Time (hours) since last game
  • Average time of ball possession
  • Time left in game
  • Total minutes played in game
  • # of shots attempted
  • # of shots made
  • Location of shots attempted
  • Location of shots made
  • Volume of boos
  • # of Fouls
  • # of Assists
  • Playing a former team
  • Time of day
  • Record of opponent
  • Feelings toward opponent
  • Last game’s performance
  • # of negative twitter comments
  • Stadium temperature
  • Stadium humidity
  • # of fans in attendance
  • # of LeBron jerseys in attendance


Figure 3: Yes, we use a chalkboard. Feels so old school!!

What was interesting about this exercise (besides the depth and breadth of the suggested variables to better predict LeBron James’s performance) was the opportunity to combine variables to create potentially even more predictive, composite metrics.  Creating these composite metrics required another round of brainstorming to understand what it was we were trying to predict.  For example, we wanted to understand and predict LeBron’s:

  • Tired Index.  This metric would be used to measure how tired LeBron is at different times of the game.  This index could be a combination of the following metrics: hours since last game, minutes handling ball, number of shots taken, time remaining in game, away or home game and minutes played in game.
  • Motivation Index.  This index would be a measure of how “motivated” LeBron is for this particular game, and how hard he is willing to push himself to get the win.  This index could be a combination of the following metrics: last game’s performance, record of opponent, defender, volume of boos, playing a former team and number of LeBron jerseys in the stands.

For example, updating LeBron’s Tired Index throughout the game (since many of the supporting metrics change during the game) can lead to in-game recommendations regarding defenders, defensive scheme, double-teaming, denying him the ball and pressing him on the perimeter.

It is interesting how the combination of multiple minor measures has the potential to yield a much more actionable and predictive measure.  And while many of these variables may be correlated with each other, in combination those correlations actually work to one’s advantage to create better predictive metrics that can be used to drive actionable recommendations.
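
As a rough sketch of how such a composite index might be computed, the snippet below normalizes a few of the brainstormed inputs and blends them with weights.  To be clear, the data and the weights here are invented for illustration; in practice the weights would be fit against an outcome such as late-game shooting percentage:

```python
import pandas as pd

# Hypothetical per-game observations; names mirror the class's brainstorm.
games = pd.DataFrame({
    "hours_since_last_game": [48, 24, 96, 24],
    "minutes_played":        [38, 42, 30, 44],
    "shots_attempted":       [22, 28, 15, 30],
})

# Min-max normalize each input to [0, 1] so no single metric dominates
# simply because of its units.
norm = (games - games.min()) / (games.max() - games.min())

# Illustrative weights only; real ones would be fit, not guessed.
weights = {"hours_since_last_game": -0.4,   # more rest -> less tired
           "minutes_played":         0.35,
           "shots_attempted":        0.25}

games["tired_index"] = sum(w * norm[col] for col, w in weights.items())
print(games)
```

Because each input is rescaled before weighting, the index can be recomputed as the in-game metrics change, which is what makes the in-game recommendations above possible.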

Summary

The By Analysis is a useful tool that not only helps you understand the key metrics and dimensions of the business, but can also yield insights into areas of the business ripe for analysis.  And when you consider how various KPIs can be grouped from a business perspective, you might find opportunities to leverage the results of the By Analysis to create new composite metrics (or scores/indices) that are better predictors of overall strengths or weaknesses.

The By Analysis is an exercise that you can leverage with your business stakeholders to unlock new insights and help guide your strategic initiatives. Regardless of your industry, you can integrate predictive metrics in this manner to optimize your key business processes and uncover new monetization opportunities. It’s your data – you just need to decide what you want to do with it.

IT as a Sustainability Enabler: Deconstructing Agility
https://infocus.emc.com/sheppard_narkier/sustainability-enabler-deconstructing-agility/
Tue, 30 Sep 2014

Reflecting on the Future

As we reflect upon the trends that emerged out of VMworld, some enabling technology has clearly grabbed the imagination and headlines. Among the most intriguing are the software-defined data center and its flexible infrastructure, which can be used to develop relevant applications in a more agile manner. The linkage between the more flexible infrastructure and the more agile approach requires some serious investment and rethinking in terms of internal processes and skills to take advantage of what the industry has termed DevOps.

Anyone who has had to develop software using archaic methods and tools—while fighting business analysts, compliance, procurement, operations and QA every step of the way—will easily embrace a new way of thinking. I have often referred to this type of exercise as running a marathon in a pool of Jell-O.

Many business executives feel their own version of frustration as well, not seeing the value of their investment as markets move and they are left in the dust, weighed down by legacy environments. Their constant cry is “I spent the money on the technology, why isn’t IT meeting my technology needs?”

Earlier this year I wrote a blog about IT’s role in fostering sustainable advantage, including agility. The blog was prescriptive at a 10,000-foot level. What I will explore here is the meaning of agility as perceived by various stakeholders from two different cultures (IT and the business), who think in fundamentally different ways.

A Snapshot of Agility

When speaking to business executives, agility means reacting to all kinds of market and regulatory changes by delivering new products and services faster, often by leveraging what has been done before, but organizing teams and processes in different ways. IT wants to “do things quicker” to please the business, but this is a challenge given the difficulty imposed by legacy environments. In either cultural view, these goals have to be supported by investment in technical products, which is the easiest and most transparent of efforts. The more difficult, less obvious, messy investment must be placed in changing various well-entrenched processes and augmenting people’s skill sets. And scariest of all, the culture that surrounds all of this activity must change.

The recurring problem with investments in agility-oriented programs is that people think of them as monolithic tasks. Instead, “agility” is composed of many highly dependent factors that need to be balanced as they are implemented. Let’s look at two major agility drivers: business performance agility and time-to-market agility. I have consistently advocated that effective use of IT starts with a business demand view, hence the order of the list below:

Business Performance Agility (BPA). In essence this is how business can predict, lead, or react to market changes that are consistent with their brand, client demographics, risk tolerance, and cost concerns.

    • How the business measures BPA. The business should realize better products and services, enhanced employee productivity, and sensible cost scaling (including IT) that can be correlated with business growth.
    • How the business measures IT support for BPA. IT’s ability to show a justifiable ROI for IT investment; effectively leveraging the organization’s collective information and services by providing cost-effective IT platforms that enhance the firm’s responsiveness, market understanding, and ultimately competitiveness.
    • How the business feels IT’s impact. Employees and consumers experience better usability, better accessibility and support, while enjoying increased information sharing and knowledge generation through a variety of collaboration tactics, all focused on unlocking hidden value.

Time-to-market Agility (TTMA). This metric characterizes IT’s ability to be flexible, responsive, and innovative in support of rapid business transformation necessitated by the business’s quest to maintain its competitive edge by being sensitive and reactive to fluid market conditions.

    • How the business and IT need to measure TTMA. There are two critical agility domains that need consideration:
        • IT is measured by faster integration of applications and information, speedier introductions of new applications, changes to existing ones, and rule updates
        • IT needs to be able to meet rapidly changing business needs by applying new and/or innovative technology that keeps the firm competitive

Implementing Agility – The Myriad Tactics

The next natural question any CIO or CTO would ask is where to start, since there is only so much budget, time, and risk that can be absorbed in a single year. At Adaptivity we encouraged our clients to dig deeper, so that we could help them understand the investment strategy required to enhance a relevant sense of agility per business unit.

Business Performance Agility Tactics. The tactics that can be used fall into several broad categories.  Each is important, but will vary in need depending upon which part of the business value chain a business unit serves. The reason is simple: other drivers such as risk and cost compete to varying degrees with agility. Since this comes down to investment, balanced choices must be made.

    • Use of IT. A significant “branding” issue on which the business judges IT, and often finds it lacking, making subsequent investment discussions more difficult.
        • Application Usability:  intuitive UIs, flexible processes, sensible error handling
        • Environment/ Systems Usability:  investments in resiliency, seamless access across multiple systems
        • Accessibility / Use of External IT Resources:  includes comprehensive mobility and flexibility for partners and suppliers
        • Workflow Enablement:  turning batch processing and email-intensive workflows into distributed, easy-to-use workflows
        • User Enablement:  user self-service in an enterprise application storefront
        • User Support:  proactive monitoring of potential or recurring problem areas for heavily used applications so that customer support is aware of these problems before the users start complaining; provides a seamless point of control, without cross-departmental finger pointing.
    • Information Leverage.  Comprehensive integrated search that includes structured and unstructured file search, along with seamless conference and workstream integration.
        • Information / Knowledge Sharing:  sharing and creation that minimizes redundancy and maximizes ease of use
        • Information Accessibility / Format Conversion:  with the advent of big data analysis, the ability to push raw data and pull results becomes heightened
        • User Collaboration:  multiuser cross-region collaboration through numerous means, including agile project tracking
        • Information / Reference Data Changes:  a federated view of enterprise data that streamlines data stewardship and enhances the ability to find necessary data with semantic aids
    • Return on IT. This is a major bone of contention that is hard to quantify across two different cultures.
        • User Productivity:  employees, partners and consumers get work done right the first time with minimal errors
        • Business Process Efficiency:  process changes are aided, not impeded by technology
        • Delivery of Business Service / Product:  deliveries can be planned with confidence that the dates will be made
        • Quality of Business Service / Product:  the services and products that IT can control enhance the brand
        • IT Cost Scalability with Business:  modular, traceable costs lead to predictable cost projections for business growth and lulls

Time to Market Agility Tactics. Some of these tactics reinforce those that were used for BPA above. Therefore, some tactics cannot be employed in isolation, but need support, and thus a different level of investment.

    • Flexibility of IT. This is not just perception; it requires a rethinking of how IT processes should work in a modern connected world.
        • Application Integration:  a move to microservices architecture, which is a specialization of service-oriented architecture (SOA)
        • Information Integration:  holistic virtualization of data in a federated model that allows for source, location, and format independence
    • Responsiveness of IT. An improved perception of IT as being more responsive to changing needs and more willing to enable better business goal alignment.
        • Speed of Application Change:  continuous integration supported by a commitment to DevOps
        • Speed of Application Introduction:  streamlined requirements processing and minimum viable product vetting
        • Speed of Change to Business Rules:  a conscious effort to externalize business logic into parameterized rules that can be easily tested before rollout
    • IT innovation. A sign that IT has gone beyond an “ivory tower” mentality of looking at technology with no goals in mind; a commitment to partnership with the business, where it is understood that innovation needs a business context to be valuable.
        • Ability to Meet Business Needs:  a conscious effort to envision a consequence-based strategy, where IT can be an enabler of strategic change through innovation
        • Ability to Apply New Technology:  IT’s ability to create a temporal, tiered landscape of technology adoption based upon clear business goals (that is, what does the innovation landscape look like 12, 18, 24, or 36 months out?)

Agility’s Bottom Line

There is only so much time, money, energy, and opportunity to expend in the pursuit of agility. So the confusion ends when the business and IT organizations can agree on what the most important goals are. EMC has proven services and a systematic decision support platform that enables executives to cut through noise and make clear decisions about the company’s future.

Taking Pause to Remember Why You Do What You Do
https://infocus.emc.com/carolyn_muise/why-you-do-what-you-do/
Tue, 30 Sep 2014

If you are like most, you spend much of your work day multi-tasking as you work to manage a host of priorities and meet countless demands on your time. Every so often, it is valuable to take a pause to focus on the big picture and remember the answer to, “Why do you do what you do?” At EMC, our goal is that each and every person in our company provides the same answer: “To satisfy the customer!”

At EMC we put a heavy emphasis on our customers’ experience.  We have built a program team that focuses on identifying ways in which we can continually improve our customers’ experience with EMC; however, we firmly believe everyone impacts the customer – some directly, others indirectly – but collectively we [EMC’s internal ecosystem] make up EMC’s Total Customer Experience (TCE).


On October 7th, EMC will celebrate its commitment to its customers and partners.  Customer Experience Day, sponsored by the Customer Experience Professionals Association (a non-profit group), is celebrated by companies across the world.  EMC is proud to host our first-ever Total Customer Experience Day, celebrating our commitment to customers globally.

The day will include live events at 11 EMC offices in 7 countries.  Each site will be customizing its own celebration, with virtual access across the globe.

  • ECN Virtual Celebration: Various countries will host TCE experts on ECN.  These EMC professionals will represent various roles at EMC.  They will be available to all ECN members to answer questions and discuss future opportunities to enhance the customer and partner experience.  Find out more HERE.
  • Experience Analytics Showcase: We will also launch the Experience Analytics Showcase on this day.  You may have seen this at EMC World 2014 in the TCE booth where it was first unveiled.  Everyone enjoyed it so immensely we knew we had to make it available year round.  This interactive web tool will be available to everyone to explore how EMC visualizes our customer feedback and other metrics to gain new insights.  Keep an eye on the Total Customer Experience website for more information on this event.
  • Full Executive Engagement: Many of our executives will be on hand at each site to speak on the importance of the customer experience, recognize groups and individuals for their contributions, and to share in the celebration.  The entire leadership team, myself included, is committed to maintaining the customer-first culture that starts at the top.

The intent of this day is to take a pause and celebrate our customer-centric culture; to engage EMC employees & customers around our customer experience vision; and gain feedback to assist in driving continuous improvement.

How does your company embrace / celebrate its commitment to the customer experience?

‘Thank You’ Is Just The Beginning
https://infocus.emc.com/kevin-roche/thank-just-beginning/
Thu, 25 Sep 2014

Whether you have invested in a global infrastructure using a broad set of solutions from EMC and our Federation partners, or are just in the preliminary conversations about the right product and service mix for your environment, we appreciate your willingness to give EMC the opportunity to earn your business.

We do not take for granted that you have a choice in which technology vendor you choose. Therefore, every interaction with our company is a new opportunity for us to prove how we will engage, enable, and evolve with you. Thank you for your trust and partnership and for believing in EMC.

I recently had the opportunity to discuss the customer experience with Bill Fanrich, SVP and CIO of Blue Cross Blue Shield (BCBS) of Massachusetts, a premier healthcare company. Bill shared the pressure faced by CIOs to be more efficient and effective and how partnering with EMC helped BCBS in its IT transformation. He also discussed how that transformation allowed BCBS to deliver a better experience to its customers. To watch the full interview, click here.

As an industry leader, it is so important to listen to the needs of our customers, to make sure that both their challenges and business initiatives are understood. These personal insights, combined with robust analytics and feedback mechanisms, are what allow EMC to continuously improve and innovate the customer experience.

Total Customer Experience Day

On October 7, 2014, EMC will be sponsoring our Total Customer Experience Day event – a global celebration of our commitment to customers. The day will include a virtual celebration hosted on the EMC Community Network, open to customers and employees, as well as onsite events at 10+ EMC campuses in 7 countries – Egypt, China, India, Singapore, Ireland, Russia and the US. Not only will EMC celebrate on this day, but we join other companies around the world who also recognize the importance of a great customer experience and take time to celebrate as part of “CX (Customer Experience) Day”.


Across the company, our leaders are excited for this day and its three main goals:

  • Celebrate EMC’s customer-centric culture
  • Recognize EMC’s passionate and committed employees
  • Gain insights from customers and employees to continuously improve the EMC experience

Your feedback and experience matter, and I hope that you will join us for our virtual celebration on October 7th, where you will hear from EMC leaders, customers and employees and have the opportunity to share your perspective via a live Q&A discussion with EMC experts.

Just as we hope to share our approach with you, we are very interested to learn how you are innovating the experience for your customers. We can only get better when we work together – so please come celebrate with us!

On behalf of EMC’s leadership team, I sincerely say “Thank you.” This is just the beginning to an enduring partnership ahead.

For more insights on IT trends from senior leaders at EMC, see our Reflections Blog.

Finally Reflecting on VMworld 2014
https://infocus.emc.com/mark_browne/finally-reflecting-vmworld-2014/
Thu, 25 Sep 2014

VMworld 2014 in San Francisco was my first VMworld experience. The following blog is my attempt to describe that experience from my own perspective. I’ll do my best to make sense as much as possible. But if I summarized it in one sentence, I would say the following: VMworld was a pleasantly and professionally overwhelming experience.

I have attended many conferences and established many peer-to-peer personal connections at them. What was a revelation at VMworld was the ability to understand how deep those connections are. The best summary of this, phrased better than I ever could, is by Hans De Leenher in his blog post entitled “This is Sparta”. I certainly made friends, but also further realized the human element of the social connections I have established.

As well as the connections, I was able to have the VMworld experience and get a feeling for how this show ran compared to others I have attended. This experience was not necessarily better than other conferences, but it was certainly different. I had a lot more conversations with other vendors and other partners than I normally would have.


For the purpose of my professional reasons for being there, it was very successful. For the EMC Ask the Expert Program we successfully executed a great “EMC Ask the Expert Tweet Chat” on the new RecoverPoint release. It was a well-engaged tweet chat and a prelude to the Ask the Expert discussion on the EMC Support Community.  So too was the EMC Ask the Expert discussion I ran covering the VMworld event.


For the EMC Elect it was a great interactive event. We held a private briefing for the Elect just ahead of the RecoverPoint announcement. A good number of the EMC Elect also spoke at breakout sessions, ran hands-on labs and engaged as Social Champions at various breakout sessions. And as ever throughout the conference, we tweeted, met up, had numerous discussions, and had good fun, all of which can be seen on Twitter under #EMCElect and in the few photos I include here.

[Photos: EMC Elect at VMworld]

And it’s worth mentioning too that the #EMCElect had a team in the #v0dgeball championship. This is an annual event that goes towards a great cause, the Wounded Warrior charity. We were knocked out in our first game  :)  but were delighted to take part in such a noble event for a great charity. Well done to EMC Elect member Fred Nix for spearheading this event, year after year. And well done to my partner in crime on the Ask the Expert Program, Roberto Araujo, for stepping up as a referee.


In summary, VMworld US 2014, for me, was a very valuable and fulfilling experience. I learned a lot, connected with some fantastic people, took part in a great charity event and had a birthday too  :)  And in contrast to losing our first dodgeball game, I smashed my planned work goals out of the park!

And one final thank you is reserved for the brilliant EMCers Erica McDonald, Will Quinn and Steve Knight, who did so, so much for the EMC events team and for the Elect team particularly.

For more blogs by Mark Browne, visit his blog, Bayside Chronicle.

Seinfeld and Learning on the 3rd Platform
https://infocus.emc.com/ernie-kahane/seinfeld-learning-3rd-platform/
Thu, 25 Sep 2014

Recently, the New York Times’ confidential innovation report was leaked. The report addressed the challenge of leveraging their success in print journalism on a digital platform. They noted that having great content alone was inadequate. The challenge was to connect that content to readers online. The “aha” moment for the New York Times was that going digital involved playing a new game.

Similarly, we’ve explored in previous blogs the rationale for corporate learning organizations to continue to journey beyond their current 2nd Platform blended learning ecosystem to 3rd Platform cloud-based learning. Reasons to extend to the 3rd Platform include:

• competing for empowered learners who have learning options
• meeting a preference for consumer-grade, on-demand learning
• scaling to global and ‘massive’ audiences
• providing new services, e.g., eLearning as interactive as classroom training
• reducing development and delivery costs

How we perceive the magnitude of change will dictate success in meeting it. Very few people proactively transform themselves. Since change is generally unpleasant and takes us out of our comfort zone, our inclination is to keep doing what we are doing. The danger of this—to paraphrase Clayton Christensen—is that in periods of disruption we may be doing a great job but it’s the wrong job. To successfully transform, we need to counter our inclination to stay the course and instead, adopt an alternative method.

Perhaps you recall the Seinfeld episode when George ‘goes opposite’. He laments that all his decisions turn out badly, so he decides he will do the exact opposite of his inclination. For example, George orders the exact opposite sandwich that he normally orders:

George: “Yes, I will do the opposite. I used to sit here and do nothing, and regret it for the rest of the day, so now I will do the opposite, and I will do something!”
…(He goes over to the woman)
George: “Excuse me, I couldn’t help but notice that you were looking in my direction.”
Victoria: “Oh, yes I was, you just ordered the same exact lunch as me.”
(George takes a deep breath)
George: “My name is George. I’m unemployed and I live with my parents.”
Victoria: “I’m Victoria. Hi.”

[Image: George from Seinfeld]

I want to suggest that corporate training organizations also counter their inclination and, like George, “go opposite” in three areas as they develop their learning innovation strategies.

1. Go Opposite: Think Digital First.
One inclination for corporate training organizations is to view the transition beyond their 2nd Platform learning as business as usual. Let’s call this approach being platform-agnostic; there is no major difference among platforms. In this view we leverage existing courses and design new learning for our cloud-based platform based on the same principles we use for 2nd Platform blended learning. I’ve talked about this as a pipe strategy. In the news business, this platform-agnostic strategy is called shovelware and refers to making print stories available digitally.

The problem with shovelware in the news business and education is that it is ineffective. As the New York Times learned, new multi-media and social strategies were required to grow readership online. Success involves telling digital stories exploiting the capabilities and new tools made possible by the 3rd Platform including commenting, sharing, video, graphics, archival info, backstories, with active promotion and tweets. Those that exploit the tools and resources of the new Platform to create purpose-driven content connect more effectively with readers.

eLearning is fundamentally different on the 3rd Platform – well-designed eLearning is intensely social and dynamic. Learning interactions personalize the content and learning experience. The best MOOCs, for example, pivot and adapt based on learner needs. 3rd Platform design skills are different and crucial.

2. Go Opposite: Connect Learners to Content.
Counter the inclination to believe that content is king and if you build it they will come. On the 2nd Platform, a pervasive mindset is that we do not need to compete for learners. However, content means nothing if learners don’t know about it and if it is not competitively superior. Internet learning is ubiquitous. It is vital that learning organizations develop new capabilities in market analysis, online audience acquisition, learner experience, and connecting with learners to drive participation and loyalty. In an increasingly digital world, an active social media marketing strategy needs to be incorporated as part of learning offerings.

3. Go Opposite: Do a Make-Over for your 2nd Platform Learning.
Discard the inclination to maintain the status quo for your current blended learning development.

MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) and ALFA reviewed viewing habits of more than 100,000 learners in over 6.9 million video sessions. The team found that brevity is critical and viewers tune out videos after 6 minutes. They also found that web-enabled lessons—“existing videos retroactively broken into shorter chunks”—are less effective than lessons designed for digital learning.

The new modular learning on the 3rd Platform changes expectations about how all learning is delivered. Big Data analytics is accelerating brain science findings that can be used to create greater learner engagement and retention. The Learning as a Service mindset of the 3rd Platform should be applied in 2nd Platform ecosystems. Create an ambiance on the 2nd Platform that feels like consumer-grade, modular cloud-based learning.

If you agree that “going opposite” makes sense, developing a digital business plan and skill set is crucial. This includes:
• Instructional Designers who can choreograph modular and ‘digital first’ learning experiences and can implement Learning as a Service (LaaS)
• Marketing and social media specialists who can find and help engage your audience
• Business and market analysts to identify competitive opportunities for learning
• Data Scientists who can analyze and make sense of trends and patterns
• Finding a business partner that can help you accelerate the process

In short, consider the skills needed to develop and manage an Internet store for learning and apply them to your future and current business plan.

I urge you to resist the inclination to simply stay the course as you transition to 3rd Platform cloud-based learning. Be like George, take a deep breath, and…

Hadoop Data Modeling Lessons – by Vin Diesel
https://infocus.emc.com/william_schmarzo/hadoop-data-modeling-lessons-vin-diesel/
Tue, 23 Sep 2014

As my friends know…okay, my nerdy friends… there are 3 big data topics that give me pause.  These topics are critical from a standpoint of operationalizing big data, but I still have not gotten my head completely around them:

  1. What skills, training, capabilities and attitudes does someone with a Business Intelligence/SQL/Statistics background need to learn in order to become a data scientist?
  2. What is the economic value of data; that is, how do I price data that I might want to buy and/or sell?
  3. In a world of “schema on query”, what tools, skills and design techniques does one need to simplify and standardize the ad hoc schema definition process in order to operationalize the resulting schema and analytic results?

Today’s blog is about that third topic, which brings us to Xander Cage.  Who is Xander Cage?  One of the original Hadoop developers?  The code name for a super-secret big data project happening within EMC Global Services?  Nah, Xander Cage is a character played by Vin Diesel in the movie “XXX.”  He’s a bad-ass that’s out to save the world from a deadly virus.  A great B-rated shoot ’em up, blow ’em up movie.

[Image: the movie “XXX”; source: Columbia Pictures]

There is a scene in the movie where one of the bad guys has the Prague police pinned down from a sniper’s position in a large industrial storage room.  Xander comes into the scene, surveys the situation, grabs a bazooka and utters:

“You have a bazooka! Dude, stop thinking Prague Police and start thinking Playstation. Blow [schtuff] up!!!”

This quote is the ultimate in “think differently” messaging, which is exactly what we need to do when we think about data modeling in a big data world.

History of Data Modeling By Bill Schmarzo

The world of data modeling (at least as it pertained to Bill Schmarzo) started with third normal form (3NF) and E. F. Codd, who defined third normal form data modeling in 1971 as a database normalization technique to improve database processing while minimizing storage costs. 3NF data modeling was ideal for online transaction processing (OLTP) applications with heavy order-entry types of needs.

When I was coding in the early 1980’s, disk space and processing power were extremely expensive. 3NF was designed to minimize the amount of data that we needed to store by ensuring that we eliminated data redundancy in storage. Heck, the entire “Y2K” panic was caused by programmers like me hardcoding “19” into the date (year) field so that we could save two bytes each time that the “year” field was used in calculations or reporting. When we were writing programs in the 1980’s, no one dreamed that our programs would still be running 20+ years later (I wonder if anyone ever found all the Easter eggs that I buried in my code, he-he-he).

As a result, we ended up with OLTP data models that looked like Figure 1.


Figure 1: Third Normal Form Sample Data Model

Data Modeling For Business Users

While 3NF was ideal for machine processing, the spaghetti nature of the data model was uninterpretable by a human user.  The world of analysis (a.k.a. query, reporting, dashboards) required a different type of data model that supported analysis such as trend lines, period-to-date calculations (month-to-date, quarter-to-date, year-to-date), cumulative calculations, basic statistics (average, standard deviation, moving averages) and previous period comparisons (year ago, month ago, week ago).

Ralph Kimball, while at Metaphor Computer Systems, pioneered dimensional modeling, or star schemas, in the early 1980s (see Figure 2).  The dimensional model was designed to accommodate the analysis approach of the business users via two important design concepts:

  • Fact tables (populated with metrics, measures or scores) that correspond to transactional systems such as orders, shipments, sales, returns, premiums, claims, accounts receivable, and accounts payable.  Facts are typically numeric values that can be aggregated (e.g., averaged, counted, summed).
  • Dimension tables (populated with attributes about that dimension) that represent the “nouns” of that particular transactional system such as products, markets, stores, employees, customers, and different variations of time.  Dimensions are groups of hierarchies and descriptors that describe the facts.  It is these dimensional attributes that enable analytic exploration, attributes such as size, weight, location (street, city, state, zip), age, gender, tenure, etc.

Figure 2: Dimensional Model (Star Schema)

Dimensional modeling was ideal for business users because it was designed with their analytic thinking in mind.  What do I mean by that?  For example, “By” analysis is a common way for capturing the reporting and analysis needs of the business users during the design phase of an analytic project.  The “By” analysis is consistent with the way the business users tend to frame their data requests such as:

  • I want to see sales and returns by month, store, geography, product category, product type and customer segment
  • I want to compare this month and last month’s claims by policy type, policy rider, zip code, city, state, region, customer tenure and customer age
  • I want to see social sentiment scores trended by social media source, subject, and day of week

Today, all Business Intelligence (BI) tools use dimensional modeling as the standard way for interacting with the data warehouse.
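
To make the contrast with what follows concrete, here is a minimal sketch of the join-plus-aggregate work a BI tool generates against a star schema.  Python with pandas stands in for the SQL a BI tool would emit, and the fact and dimension tables are hypothetical:

```python
import pandas as pd

# Minimal star schema: one fact table plus two dimension tables.
fact_sales = pd.DataFrame({
    "product_id": [1, 1, 2, 2],
    "store_id":   [10, 11, 10, 11],
    "sales":      [100.0, 150.0, 80.0, 60.0],
})
dim_product = pd.DataFrame({"product_id": [1, 2],
                            "category":   ["grocery", "apparel"]})
dim_store = pd.DataFrame({"store_id": [10, 11],
                          "region":   ["West", "East"]})

# "Sales by category and region" resolves to joins plus a group-by.
report = (fact_sales
          .merge(dim_product, on="product_id")
          .merge(dim_store, on="store_id")
          .groupby(["category", "region"])["sales"].sum())
print(report)
```

Every query pays for those joins at run time, which is precisely the cost the Hadoop-style approach below tries to avoid.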

Data Modeling For Data Science

In the world of data science using Hadoop, we again need to think differently about how we do data modeling.  Hadoop grew out of work at Google and Yahoo to deal with very long, flat web logs (see Figure 3).  Heck, Google called its design “Big Table[1]” since it was an uber-large table, not a series of smaller tables tied together with joins – it was just designed differently.


Figure 3: Sample Log File

For example, Hadoop accesses data in very large blocks – the default block size is 64MB to 128MB – whereas relational database block sizes are typically 32KB or less.  To optimize this block size advantage, the data science team wants very long, flat records.

For example, some of our data scientists prefer to “flatten” a star schema by collapsing or integrating the dimensional tables that surround the fact table into a single, flat record in order to construct and execute more complex data queries without having to use joins (see Figure 4).


Figure 4: Flattening the Star Schema for Data Science Work on Hadoop
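
Here is a minimal sketch of that flattening step, again in Python with pandas on hypothetical tables (at Hadoop scale this would presumably be done with tools like Hive, Pig or Spark rather than pandas):

```python
import pandas as pd

fact = pd.DataFrame({"product_id": [1, 2, 1],
                     "store_id":   [10, 11, 11],
                     "sales":      [100.0, 80.0, 60.0]})
dim_product = pd.DataFrame({"product_id": [1, 2],
                            "category":   ["grocery", "apparel"],
                            "brand":      ["BrandA", "BrandB"]})
dim_store = pd.DataFrame({"store_id": [10, 11],
                          "city":     ["Reno", "Boston"]})

# Collapse each dimension into the fact row: one wide record per transaction.
flat = (fact.merge(dim_product, on="product_id")
            .merge(dim_store, on="store_id"))

# Persisted as one long, flat file, the data can then be scanned in
# Hadoop-sized blocks with no joins at query time.
flat.to_csv("sales_flat.csv", index=False)
```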

Taking this to the next level, the data science team will create an uber-long record for each business function that can be more easily analyzed using Hadoop (see Figure 5).


Figure 5: Using Large Flat Files To Eliminate/Reduce Joins On Hadoop

For example, we could have the following massively long records for an individual whom we want to analyze:

  • Customer demographics (age, gender, current and previous home addresses, value of current and previous home, history of marital status, kids and their ages and genders, current and previous income, etc.)
  • Customer purchase history (annual purchases including items purchased, returns, prices paid, discounts, coupons, location, day of week, time of day, weather condition, temperatures)
  • Customer social activities (entire history of social media posts, likes, shares, tweets, favorites, retweets, etc.)

One technique that Dr. Pedro Desouza, the head of our data science practice at EMC Global Services, uses in order to avoid too many and frequent joins is to replicate just the key dimensional attributes into the fact table. In this way, he keeps the clear representation of the star schema but eliminates the joins by performing the heavy lifting analysis just on the flat file. The tradeoff is a little bit of data redundancy to keep clarity, but it takes advantage of the Hadoop performance characteristics.
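
Below is a short sketch of my reading of that partial-denormalization technique, on hypothetical tables: only the key attributes the analysis needs are copied into the fact table, so the star schema stays intact as the system of record while the heavy lifting runs join-free:

```python
import pandas as pd

fact = pd.DataFrame({"product_id": [1, 2, 1], "units": [3, 5, 2]})
dim_product = pd.DataFrame({"product_id": [1, 2],
                            "category":   ["grocery", "apparel"],
                            "brand":      ["BrandA", "BrandB"],
                            "supplier":   ["X", "Y"]})

# Replicate only the key dimensional attribute(s) the analysis needs --
# here just "category" -- accepting a little redundancy per fact row.
fact_plus = fact.merge(dim_product[["product_id", "category"]],
                       on="product_id")
print(fact_plus)
```

The trade is deliberate: a few repeated attribute values per row in exchange for sequential, join-free scans that suit Hadoop’s block sizes.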

What Does This Mean?

It means that the way we designed data models for OLTP applications (using third normal form) and for data warehousing (using dimensional modeling) needs to change to take advantage of the inherent architecture and processing advantages offered by Hadoop.  Data scientists will create flat data models that take advantage of the “big table” nature of Hadoop to handle massive volumes of raw, as-is data.

My friend (and data scientist) Wei Lin calls this data modeling transition from relational (third normal form) to dimensional (star schema) to universal “The Good, The Bad and The Ugly” (I always had a preference for “Angel Eyes” in that movie, if you know which character that is).  But maybe that’s the wrong title? Maybe the name should be “The Complex, The Elegant and The Simple,” which reflects the changing nature of data modeling.

As Xander Cage says, we need to think differently about how we approach data modeling in a big data world using Hadoop.  Old data modeling techniques that may have been optimal in a world of constrained storage and processing power, are no longer the optimal data modeling techniques in a big data world.

We need to embrace the “think differently” mentality as we continue to explore and learn from this brave new world.  It may be a painful transition, but we made the successful transition from third normal form to dimensional modeling…eventually.


[1] In 2004 a Google in-house team of scientists built a new type of database called “Big Table”.  Big Table successfully broke through the barriers presented by a traditional RDBMS to handle massive volumes of semi-structured data (log files). Google scientists published two scholarly research papers on their project, the first describing the Google File System (GFS) and the second Google MapReduce.  Doug Cutting and Mike Cafarella, two independent developers, discovered that work and convinced Yahoo that this new structure was the solution to their search and indexing challenges.  By 2006 Yahoo had the first prototype, called ‘Hadoop’, and in 2008 Yahoo announced the first commercial implementation of Hadoop.  Facebook was a fast follower, and soon Twitter, eBay, and other major players were also adopting the technology. (http://data-magnum.com/a-brief-history-of-big-data-technologies-from-sql-to-nosql-to-hadoop-and-beyond/)

Taking the Time To Celebrate Customers
https://infocus.emc.com/dave_matson/taking-time-celebrate-customers/
Tue, 23 Sep 2014

Have you ever noticed all of the official days that are celebrated throughout the world? They range from the noble World Cancer Day on February 4, to the quirky Geek Pride Day on May 25, to the savory International Bacon Day on the first Saturday before Labor Day. A more complete list can be found here. (Ed. note: Dave submitted this article on Talk Like a Pirate Day!)

But what about a day that celebrates the customer? You know – the people that indirectly sign our paychecks?! Consumer-oriented businesses, ranging from Starbucks to Dairy Queen, have hosted Customer Appreciation Days. But it’s less apparent that Fortune 500 companies and larger B2B organizations are embracing this type of appreciation and celebration…until now.

Customer Experience Day, or CX Day, was founded by the CXPA, or Customer Experience Professionals Association, and is intended to be “a pause during the year, when your company can take a breather, refresh and recommit to the customer experience, and show the customer love in your company.”

This year, it takes place on October 7th and many organizations are making plans to celebrate in style, including EMC!!!

At EMC, we are putting together a global, company-wide event. Key customers and partners will be joining the celebration in our offices in Hopkinton and throughout the world. Check out EMC’s Community Network for more information and to join the discussion.

October 7th will clearly be a celebration of our valued customers and partners, but it’s worth calling out all of the great employees who deliver those great experiences every day. Here are a few of their stories:

[Video: Global Adventures – mountain biking]

[Video: Views of the Horizon – sailing]

[Video: Going Deep – cave diving]
Thankfully, once CX Day has come and gone, we can all start making preparations for World UFO Day, which takes place every July 2. First, I’ll need a more powerful telescope!

 “The opinions expressed here are my personal opinions. Content published here is not read or approved in advance by EMC and does not necessarily reflect the views and opinions of EMC nor does it constitute any official communication of EMC.”

The post Taking the Time To Celebrate Customers appeared first on InFocus.

How to Become a Data Scientist – Interview #2 https://infocus.emc.com/frank_coleman/become-data-scientist-interview-2/ https://infocus.emc.com/frank_coleman/become-data-scientist-interview-2/#comments Mon, 22 Sep 2014 17:34:37 +0000 https://infocus.emc.com/?p=20047 I’m very fortunate to work with several Data Scientists, all with varying backgrounds and work histories. Aspiring Data Scientists frequently ask me for career path advice. What better way to advise them than by sharing the experiences of EMC Data Scientists? A few months ago, I sat down with Oshry Ben-Harush. This time I have Joseph…Read More

The post How to Become a Data Scientist – Interview #2 appeared first on InFocus.

I’m very fortunate to work with several Data Scientists, all with varying backgrounds and work histories. Aspiring Data Scientists frequently ask me for career path advice. What better way to advise them than by sharing the experiences of EMC Data Scientists? A few months ago, I sat down with Oshry Ben-Harush. This time I have Joseph Dery in the “hot seat”:

FC:    What is your education background?

JD:    I have a BS in Statistical Modeling & Marketing Strategy from Babson College, an MS in Business Analytics from Bentley University, and am now a PhD student in Business Analytics at Bentley.

FC:    How did you become a Data Scientist?

JD:    In undergrad I focused on Statistical Modeling and Marketing Strategy. Back then, they were completely separate. I enjoyed both but thought I needed to focus my career on only one of them. My professor challenged me, asking why I could only focus on one. She said Analytics was a way to do both. I then found the Bentley Masters Marketing Analytics program.

This program was 1/3 Statistics, 1/3 Computer Science, and 1/3 Business. It allowed me to use marketing and statistical modeling together, both in a business wrapper. It was my first exposure to Data Science.

My first Data Science job was with a New Hampshire boutique company specializing in Data Mining. They blended business, math, and computer science using creative approaches, combining methods in unique ways: testing new models, hypothesis testing, and pushing past boundaries that traditional methods could not.

FC:    What skills do you think are essential for a successful Data Scientist?

JD:

1)   Storytelling – there is so much business focus, and if you can’t convey what you found in an effective way, you are dead in the water. What you found will just sit there if you can’t tell it in a compelling and concise way; people won’t understand what you are talking about.

2)   Creativity – there is a reason why we get the problems we do: most of the talented business analysts in the company have already struggled with them using traditional means. To solve these problems, we need to think outside the box and challenge business assumptions, and that requires creativity. Thankfully, our models don’t need to be 100% accurate; we can work with 80% accuracy. The key is finding the right mix of explainability and accuracy to help the business (see the sketch after this list).

3)   Courage – we are often fighting what the business traditionally believes. Stand up for what you found, but convey it in a way the business can consume.

4)   Statistics and computer science – the ability to manipulate data and extract meaning. You can get academic training to learn this; the key is to make sure you know enough of each.
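
As a toy illustration of the accuracy-versus-explainability trade-off in point 2, and not anything drawn from EMC’s actual models, the sketch below fits an interpretable logistic regression and a harder-to-explain gradient-boosted ensemble to invented data.

```python
# A toy comparison on synthetic data: the ensemble usually edges out the
# regression on accuracy, but only the regression hands the business a
# set of coefficients it can read and act on.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

explainable = LogisticRegression(max_iter=1000).fit(X_train, y_train)
black_box = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

print("logistic regression accuracy:", explainable.score(X_test, y_test))
print("gradient boosting accuracy:  ", black_box.score(X_test, y_test))
print("readable coefficients:", explainable.coef_[0][:5])
```

If the simpler model lands within a few points of the fancy one, Joe’s 80% rule says ship the one the business can understand.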

FC:    What are the common tools you use in your everyday work?

JD:

  • Whiteboard – all projects start out with brainstorming sessions. I wish it were an Etch A Sketch, because we often have several whiteboard sessions and I want to shake away the first few.
  • Environment to manipulate raw data – Greenplum or any relational database, Hadoop for unstructured data, or any combination of the two.
  • Once data is in a format that is workable for Data Mining (unsupervised) and/or Statistical Modeling, I bring in some or all of the following tools (a minimal sketch of the full pipeline follows this list):
    • SAS, R, Alpine Miner
    • A lot of training focuses on a particular tool, but the output is ultimately the same. The tool itself doesn’t really matter; it’s how you interpret the output that matters.
  • Focus Groups – bringing in the people who are going to work with the product you produce. After a few iterations of data mining or statistics, we share results and see if they agree or if there is something we missed.
  • Platform for results – Tableau, SAS Visual Analytics
  • Program Management – Training, getting items into production, and Support
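
Below is a minimal, entirely hypothetical sketch of that pipeline: pull raw data from a relational store (Greenplum speaks the PostgreSQL wire protocol, so a generic PostgreSQL connection is assumed here), run an unsupervised step, and land the output where a platform like Tableau can pick it up. The connection string, table, and column names are all invented for illustration.

```python
# Hypothetical end-to-end pass: relational store -> model -> BI-readable file.
import pandas as pd
from sklearn.cluster import KMeans
from sqlalchemy import create_engine

# Invented connection details; Greenplum accepts PostgreSQL-style URLs.
engine = create_engine("postgresql://user:password@dbhost/analytics")
df = pd.read_sql(
    "SELECT customer_id, spend, tenure_months FROM customer_summary", engine
)

# Unsupervised step first, as in the workflow above: segment before modeling.
df["segment"] = KMeans(n_clusters=4, random_state=0).fit_predict(
    df[["spend", "tenure_months"]]
)

# A flat extract that Tableau (or any BI tool) can use as a data source.
df.to_csv("customer_segments.csv", index=False)
```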

FC:    What is the most important characteristic needed to be a Data Scientist?

JD:    Storytelling – before there were Data Scientists, you had a Statistician, a Business Consultant, and a Computer Scientist. Now a Data Scientist needs to wear all of those hats, pull everything together, and tell a story. It’s no longer good enough to build the algorithm, prove the hypothesis, and build the database.  You need to be able to tie them all together in a way that makes sense.  This is the glue that holds everything together. Sometimes you have to sacrifice accuracy for explainability, which is not something a statistician would naturally do.

FC:    What is the coolest or most impactful project you have worked on?

JD:    The business problem was how to focus our reps’ attention on the customers most likely to renew. The project code name was Powerball (PB).

It was a cross-functional collaboration that had to come together around a common mission, and a great example of trading accuracy for explainability. In fact, the first few iterations were only about 80% accurate on average. Today, we’ve grown the models and use more advanced techniques. The difference, though, is that our stakeholders are now comfortable with what we provide them, giving us more freedom to get fancy in our modeling. During the initial phases of PB, we spent a lot of time in focus groups and brought key stakeholders and influencers in to take part in the build. It was the first time I had helped deploy a worldwide Data Science process. It was more than a model; it was an organizational transformation.
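
In the spirit of Powerball, and built on entirely synthetic data with invented features rather than anything EMC shipped, here is a small sketch of a renewal-propensity score that turns a model into the ranked call list the reps actually wanted.

```python
# Synthetic renewal-propensity sketch: score customers by likelihood to
# renew and hand reps a ranked outreach list. All data here is invented.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
customers = pd.DataFrame({
    "support_tickets": rng.poisson(3, 1000),
    "product_usage": rng.random(1000),
    "contract_months_left": rng.integers(0, 24, 1000),
})
# Fabricated label: heavier product usage nudges renewal likelihood up.
renewed = (customers["product_usage"] + rng.normal(0, 0.2, 1000) > 0.5).astype(int)

model = LogisticRegression().fit(customers, renewed)
customers["renewal_score"] = model.predict_proba(customers)[:, 1]

# The deliverable is not the model itself but this prioritized list.
print(customers.sort_values("renewal_score", ascending=False).head())
```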

FC:    How do you keep current with industry trends?

JD:    Through my academic research. One of the biggest things I can advocate for is continuing your education. To maintain your competitive edge in the industry, your education can’t stop, because business challenges and data science methods keep evolving.

FC:    What guidance would you give aspiring Data Scientists?

JD:    Find an education program that works for you, and ensure it includes Business, Computer Science, and Statistics. Don’t start out trying to go after the big companies; getting experience is critical to developing your storytelling and creativity. Working for startups while in school can accelerate this for you!


Special thanks to Joe for sharing his experiences and suggestions! Hopefully this will help aspiring Data Scientists get started in an Advanced Analytics/Data Science career. In future posts I will interview other Data Scientists to demonstrate their varying backgrounds and highlight their similarities and differences.

The post How to Become a Data Scientist – Interview #2 appeared first on InFocus.
