Makeover Monday 2017 – Week 2

It’s time for Makeover Monday – Week 2.  This week’s data set was the quarterly sales (by units) of Apple iPhones for the past 10ish years.  The original article accompanying the data indicated that the golden years of Apple may be over.

So let me start by saying – I broke the rules (or rather, the guidelines).  Makeover Monday guidelines indicate that the goal is to improve upon the original visualization and stick to the original data fields.  I may have overlooked that guideline this week in favor of adding a little more context.

When I first approached the data set and dropped it into Tableau, the first thing I noticed was that Q4 always dips compared to the other quarters of the year.

This view contradicted all of my existing knowledge of how iPhone releases work.  Typically every year Apple holds a conference around the middle or end of September announcing the “new” iPhone – either the incremental off-year update (aka the S) or a new generation.  Pre-orders and sales come in the weeks shortly following, and on top of that I would expect sales to stay heightened throughout the holiday season.

This is where I immediately went back to the data to challenge it, and I noticed that Apple defines its fiscal year differently.  Specifically, October through December (of the previous calendar year) counts as Q1 of the current fiscal year – Q1 of 2017 is actually 10/1/16 to 12/31/16.  That means if you want to think in normal calendar quarters, everything needs to be shifted.
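For anyone who wants to make that shift concrete, here’s a rough sketch in Python/pandas of the adjustment (the column names and unit numbers below are placeholders, not the actual Makeover Monday data):

```python
import pandas as pd

# Placeholder rows; Apple labels Oct-Dec 2016 as Q1 of fiscal 2017.
sales = pd.DataFrame({
    "fiscal_quarter": ["Q1 2017", "Q2 2017", "Q3 2017", "Q4 2017"],
    "units_millions": [78.0, 51.0, 41.0, 47.0],  # illustrative values only
})

def fiscal_to_calendar(label: str) -> str:
    """Map an Apple fiscal quarter label to the calendar quarter it covers."""
    quarter, year = label.split()
    q_num, year = int(quarter[1]), int(year)
    if q_num == 1:
        # Fiscal Q1 is October-December of the *previous* calendar year.
        return f"Q4 {year - 1}"
    # Fiscal Q2-Q4 map to calendar Q1-Q3 of the same year.
    return f"Q{q_num - 1} {year}"

sales["calendar_quarter"] = sales["fiscal_quarter"].map(fiscal_to_calendar)
print(sales)
```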

Now I was starting to feel much better about how things were looking.  It aligned with my real world expectations.

I still couldn’t help but feel that a significant portion of the story was missing.  In my mind it wasn’t fair to only look at iPhone sales over time without understanding more data points of the smartphone market.  I narrowed it down to overall sales of smartphones and number of smartphone users.  The idea I had was this: have we reached a point where the number of smartphone users is now a majority?  Essentially the Adoption Curve came to my mind – maybe we’ve hit that sweet spot where the Late Majority is now getting in on smartphones.

To validate the theory and keep things simple, I did quick searches for data sets I could bring into the view.  As if through serendipity, the two additional sources I stumbled upon came from the same source as the original (statista.com).  I went ahead and added them to my data set and got to work.

My initial idea was this: a line plot of iPhone sales vs. overall smartphone sales, to see if the directionality was the same.  Then a smaller graph of smartphone users off to the side (mainly because that data was US only – I couldn’t find a free global data set).  And the last viz would be a combination of the three showing basic “growth” change.  That, in a very basic way, would display an answer to my question.

I went through a couple of iterations and finally landed on the view below as my final.

I think it sums up the thought process and answers the question I originally asked myself when I approached the data set.  And hopefully I can be pardoned (if that’s even necessary), since the added data merely enhanced the information at hand and kept with the simplicity of the data points available (units and time).

#WorkoutWednesday Week 1

Another great community activity is Workout Wednesday, hosted by Andy Kriebel and Emma Whyte.  According to Andy it’s “designed to test your knowledge of Tableau and help you kick on in your development.”  They’re alternating weeks (odd vs. even).

Here’s the first task in a visual nutshell (using Superstore data set):

I’m happy to say that I was able to complete the task.  What was the most interesting part?  To get the dots on the single lines I ended up redoing a field that had a secondary table calculation and using some built-in functions: RUNNING_SUM() and TOTAL().  The dots continued to be tricky, but I resolved it by using AND logic within my IF statement and leveraging LOOKUP().
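I won’t reproduce the exact calculated fields here, but the general shape of the logic looks something like this sketch in Python/pandas – with a grouped cumulative sum standing in for RUNNING_SUM(), a grouped total for TOTAL(), and shift() playing the role of LOOKUP() (all column names and numbers are made up for illustration):

```python
import pandas as pd

# Made-up stand-in for the Superstore data: a few orders per year.
orders = pd.DataFrame({
    "order_date": pd.to_datetime(["2014-01-05", "2014-06-20", "2014-11-02",
                                  "2015-02-14", "2015-08-09", "2015-12-30"]),
    "sales": [120.0, 340.0, 210.0, 400.0, 150.0, 275.0],
})
orders["year"] = orders["order_date"].dt.year
orders = orders.sort_values(["year", "order_date"])

# RUNNING_SUM(SUM([Sales])) analog: cumulative sales along each year's line.
orders["running_sales"] = orders.groupby("year")["sales"].cumsum()

# TOTAL(SUM([Sales])) analog: the full-year total broadcast to every row.
orders["year_total"] = orders.groupby("year")["sales"].transform("sum")

# LOOKUP(..., 1) analog: peek at the next point on the same line.
orders["next_sales"] = orders.groupby("year")["sales"].shift(-1)

# AND logic inside the IF: only emit a value (the dot) where the running sum
# has reached the year total AND there is no following point on the line.
orders["dot"] = orders["running_sales"].where(
    (orders["running_sales"] == orders["year_total"]) & orders["next_sales"].isna()
)

print(orders[["year", "order_date", "running_sales", "dot"]])
```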

I also did a micro upgrade.  The instructions indicated that red should highlight the “most current year.”  When interacting with the viz on the original blog, I noticed that only 2015 was red and the title was static.  So I added logic to highlight the most recent year and made the title dynamic as well.
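The idea behind that upgrade is nothing more than comparing each year against the maximum year in the data rather than hard-coding 2015 – roughly like this little sketch (again with made-up values):

```python
import pandas as pd

# Made-up yearly summary; the real workbook derives this from Superstore.
yearly = pd.DataFrame({"year": [2012, 2013, 2014, 2015],
                       "sales": [980, 1120, 1350, 1490]})

# Highlight only the most recent year so the red color - and the title -
# update automatically when new data arrives.
most_recent = yearly["year"].max()
yearly["highlight"] = yearly["year"] == most_recent
title = f"Sales through {most_recent}"

print(title)
print(yearly)
```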

Full viz on my Tableau Public page.

Makeover Monday 2017 – Week 1

It’s officially 2017 – the start of a new year.  As such, this is a great time for anyone in the Tableau universe to make a fresh commitment to participate in the community challenge known as Makeover Monday.

As I jump into this challenge, I’ve made the conscious decision to start with the things I already like doing and to add on each time.  This to me is the way that I’ll be able to stay actively involved and enthusiastic.  Essentially: keep it simple.

For this week’s data set it was obvious that something of a comparative nature needed to be applied.  I started off with a basic dot plot and went from there.

What I ended up with: a slope chart with the slope representing the delta in rank of income by gender, the size of the line representing the annual monetary difference in income, and 3 colors representing categorized multipliers on the wage gap.

I wanted this to be for a phone, so I held to the idea of a single viz.  Interactivity is really limited to tooltips; most other nuance comes from the presentation of the visualization itself.

And I pushed myself to add a little journalistic flair this week.  Not really my style, but I figured I would see where it took me.

Book Binge – December Edition

I typically spend the end of my year self-reflecting on how things have gone – both the good and the bad.  Usually that leads me to this thoughtful place of “I need more books.”  For some reason, books are instant inspiration to me and a great alternative to binge streaming.  They remind me of the people I want to be, the challenges I want to battle and conquer, and they seamlessly entangle themselves into whatever it is I am currently experiencing.

Here are 3 of my binges this month:

First up: You Are a Badass: How to Stop Doubting Your Greatness and Start Living an Awesome Life by Jen Sincero

This is a really great read.  Despite the title being a little melodramatic (I don’t really believe that I’m not already a super badass, or that my greatness isn’t already infiltrating the world), Jen writes in a style that is very easy to understand.  She breaks down several “self help” concepts in an analytical fashion that reveals itself through words that actually make sense.  There’s a fair amount of brash language as well, something I appreciate in writing.

Backstory on this purchase:  I actually bought a copy of this book for me and 2 fellow data warriors.  I wanted it to serve as a reminder that we are badasses and can persevere in a world where we’re sometimes misunderstood.

To contradict all the positivity I learned from Jen Sincero, I then purchased this guy: The Subtle Art of Not Giving a F*ck by Mark Manson.  (Maybe there’s a theme here: I like books with profanity on the cover?)

Despite the title, it isn’t about how you can be indifferent to everything in the world – it’s definitely not a guide on how to detach from everything going on.  Instead it’s a book designed to help you prioritize the important things, see suffering as a growth opportunity, and figure out which struggles you’re willing to take on over and over.  I’m still working my way through this one, but I appreciate some of the basic principles that we all need to hear.  Namely, that the human condition IS to be in a constant state of solving problems – suffering, fixing, improving, overcoming.  That there is no finish line, and when you reach your goal you don’t get confetti and prizes (maybe you do), but instead a whole slew of new problems to battle.

The last book of the month is more data related.  It’s good old Tableau Your Data by Dan Murray and the InterWorks team.

I was inspired to buy this after I met Dan (way back in March of 2016).  I’ve had the book for several months, but wanted to give it a shout out for being my friend.  I’ve had some sticky challenges regarding Tableau Server this month and the language, organized layout, and approach to deployment have been the reinforcement (read as: validation) I’ve needed at times in an otherwise turbulent sea.

More realistically – I try to buy at least 1 book a month.  So I’m hoping to break in some good 2017 habits of doing small recaps on what I’ve read and the imprint new (or revisited) reads leave behind.

The Float Plot

One of the more interesting aspects of data visualization is how new visualization methods are created.  There are several well-established charts, graphs, and plots out there that visualization artists typically rely on.

As I’ve spent time reading more about data visualization, I started thinking about potential visualizations out there that could be added into the toolkit.  Here’s the first one that I’ve come up with: The Float Plot.

The idea behind the float plot is simple.  Plot one value that has some sort of range of good/acceptable/bad values and use color banding to display where it falls.  It works well with percentage values.

I’ve also made a version that incorporates peers.  Peers could be previous time period values or they could be less important categories.  The version with peers reminds me somewhat of a dot plot, but I particularly appreciate the difference in size to distinguish the important data point.

What’s also great about the Float Plot is that it doesn’t have to take up much space.  It looks great scaled short vertically or narrow horizontally.
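If you want to play with the concept outside of Tableau, here’s a rough sketch of a Float Plot in Python/matplotlib as I’ve described it – colored bands for bad/acceptable/good ranges, small dots for peers, and a larger dot for the value of interest (all numbers are made up for illustration):

```python
import matplotlib.pyplot as plt

# Made-up values purely for illustration.
value = 0.72                        # the important data point (e.g. a percentage)
peers = [0.55, 0.61, 0.66, 0.80]    # previous periods or less important categories
bands = [(0.0, 0.5, "#f4cccc"),     # bad
         (0.5, 0.7, "#fff2cc"),     # acceptable
         (0.7, 1.0, "#d9ead3")]     # good

fig, ax = plt.subplots(figsize=(5, 1.2))   # works scaled short and wide

# Color banding to show where the value falls.
for lo, hi, color in bands:
    ax.axvspan(lo, hi, color=color, lw=0)

# Peers as small dots, the key value as a larger one "floating" over the bands.
ax.scatter(peers, [0] * len(peers), s=30, color="grey", zorder=2)
ax.scatter([value], [0], s=200, color="black", zorder=3)

ax.set_xlim(0, 1)
ax.set_yticks([])
ax.set_xlabel("% of target")
plt.tight_layout()
plt.show()
```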

Enjoy the visualization on my Tableau Public profile here.

Statistical Process Control Charts

I’ve had this idea for a while now – create a blog post and video tutorial discussing what Statistical Process Control is and how to use different Control Chart “tests” in Tableau.

I’ve spent a significant portion of my professional career in business process improvement, and I always like it when I can take techniques from a discipline rooted in industrial engineering and apply them in a broader sense.

It also gives me a great chance to brush up on my knowledge and learn how to order my thoughts for presenting to a wide audience.  And let’s not forget: an opportunity to showcase data visualization and Tableau as the delivery mechanism of these insights to my end users.

So why Statistical Process Control?  Well, it’s a great way to use the data you have and apply different tests for early detection of issues.  Several of the rules out there are aimed at finding “out-of-control,” non-normal, or repetitive patterns within a stream of data, and different rules have been developed based on how we might be able to detect them.

The video tutorial above goes through the first 3 Western Electric rules.  Full details on Western Electric via Wikipedia: here.

Rule 1: Very basic – it uses the principle of the bell curve to put a spotlight on points above the Upper Control Limit (UCL) or below the Lower Control Limit (LCL), i.e. more than 3 standard deviations from the mean.  These are essentially outlier data points that don’t fall within our typical span of 99.7%.

Rule 2: Takes surrounding observations into consideration.  Looking at 3 consecutive observations, are 2 out of 3 more than 2 standard deviations from the average?  In this rule the observations must be on the same side of the average line when beyond 2 SD.  Since 2 SD covers roughly 95% of values, having 2 out of 3 points in a set land out that far could signal an issue.

Rule 3: Considers even more data points within a collection of observations.  In this scenario we’re looking for 4 out of 5 observations more than 1 SD from the average, again staying on the same side of the average line throughout the 5 points.  This one really shows the emergence of a trend.
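To make the rules concrete, here’s a minimal sketch of the first three Western Electric rules in Python (made-up calorie values; a production chart would use the process’s established control limits rather than the sample mean and standard deviation):

```python
import numpy as np

def western_electric_flags(values):
    """Flag points violating the first three Western Electric rules.

    Returns three boolean arrays (rule1, rule2, rule3) aligned with `values`.
    Control limits are estimated from the data itself for simplicity.
    """
    x = np.asarray(values, dtype=float)
    mean, sd = x.mean(), x.std(ddof=1)
    dev = (x - mean) / sd                      # deviation in units of SD
    n = len(x)
    rule1 = np.abs(dev) > 3                    # Rule 1: beyond the UCL/LCL (+/- 3 SD)
    rule2 = np.zeros(n, dtype=bool)
    rule3 = np.zeros(n, dtype=bool)
    for i in range(n):
        # Rule 2: 2 of 3 consecutive points beyond 2 SD, same side of the mean.
        if i >= 2:
            window = dev[i - 2:i + 1]
            for side in (1, -1):
                if np.sum(side * window > 2) >= 2:
                    rule2[i] = True
        # Rule 3: 4 of 5 consecutive points beyond 1 SD, same side of the mean.
        if i >= 4:
            window = dev[i - 4:i + 1]
            for side in (1, -1):
                if np.sum(side * window > 1) >= 4:
                    rule3[i] = True
    return rule1, rule2, rule3

# Example with made-up daily calorie counts.
calories = [2100, 2200, 2050, 2600, 2650, 2700, 2500, 2550, 1400, 2150]
r1, r2, r3 = western_electric_flags(calories)
print(r1, r2, r3, sep="\n")
```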

I applied the first 3 rules to my own calorie data to detect any potential issues.  It’s very interesting to see the results.  For my own particular data set, Rule 3 was of significant value.  Having it run in line as new daily data funnels in could prevent me from going on a “streak” of either over- or under-consuming.

 

Interact with the full version on my Tableau Public profile here.

#MakeoverMonday 11/22/16 – Advanced Logging Edition

And it’s time – my first ever Makeover Monday.  I’ll admit, I’ve attempted to catch up in the past, but always lost steam.  I think the first data set I tried was related to sports, and I struggled to focus on making something interesting.

Despite my follies, I’m proud to say that I’ve participated in this week’s Makeover Monday in honor of the special advanced logging that is taking place.  Along with submitting work with the hashtag on Twitter, Tableau has asked us to upload a copy of our log files and workbook.  Contained within the advanced log files are .PNGs that show analysis iterations.

I went into this Monday with the idea of doing a basic “best practices” version – one that would mimic something I might create for pure exploration and zero data journalism.  I tried to stick with one element that I thought worked well and built the dashboard around it.

Looking at the other participants, I’m already thinking that my time heatmap could be improved.  My mind was stuck on the day numbers and quarters.  I should have switched to days of the week!  Regardless – here it is:


And the GIF:

[GIF: makeover-monday-112116]

#data16 Data Dump

Last night was our monthly Phoenix Tableau User Group (PHXTUG) meeting and as part of the post-excitement of Tableau’s 2016 conference we took some time to go through their strategy and some upcoming features.

Full video is available here:

Interested in reusing the slides? Find the deck here:

#data16 Day 3

Admittedly I’m jumping from day 1 to day 3.  I hit a micro wall on Tuesday.  But now that I’ve pushed through to Wednesday – it is time to focus on the amazing.

First up – paradigm shift.  I came in with a fairly naive vision of what to expect and how to “get the most” out of the conference.  This involved the idea of attending several hands-on sessions and maximizing my time soaking in how others solve data problems.  The ‘why’ behind the initial decision: I have a particular love for seeing how other people pull apart problems.  I was once asked by a colleague what my passion was – I said that I loved understanding the universe.  Pulling apart anything and everything, understanding it, cataloging it, figuring out how it fits into existence.  So the opportunity to see how others tackle things was something I had to take.

So what was the paradigm shift?  The conference isn’t just for seeing people solve problems.  It’s about seeing people communicate their passion.  And this happens in a million different ways.  This morning it happened with Star Trek and making data fun and serious.  Later it was 300+ slides of humor secretly injected with sage wisdom.  The word that comes to my mind is intensity.  I think really what I started seeking out was intensity.  And there’s no shortage.

My takeaway: Focus more on the passion and intensity from others.  Soaking this in becomes fuel.  Fuel for improvement, potential, and endless possibilities.  I can always go back and learn the intricate, well documented solutions.  I can’t recreate magic.

Second item – commitment.  Commitment is accountability, following through, sticking it out, dedication.  Commitment is daunting.  Commitment is a conscious choice.  I made a commitment to myself to be present, to engage with others.  Following through has been difficult (and very imperfect), but it has been unbelievably rewarding.  Thinking back to my day 1 thoughts – I fall back to community.  Committing to this community has been one of the best decisions I’ve made.

My takeaway: Human connections matter and are second to none.  Human connections make all the gravy of data visualization, playing with data, and problem solving possible.  (Also when you commit to dancing unafraid at a silent disco, you end up having an amazing day.)

Final item – Try everything that piques interest.  (This one I will keep short because it’s late.)  If you sense something is special, RUN TOWARD IT.  Special is just that: special.  Unique, one-of-a-kind, infrequent.  I think the moments I’ve had while here will turn into what shapes the next year of my life adventures.

My love note for Wednesday – in the books.

#data16 Day 1

What better way to commemorate my first day at #data16 than sharing the highs, lows – what has met expectations and what I didn’t expect.

The community – Probably the one thing I couldn’t anticipate coming into #data16 was how the virtual community (mainly via Twitter) compared to reality.  Like internet dating, you never really know how things are going to be until you meet someone in real life.  Not that I am shocked, but everyone that I’ve met from the blogosphere/twitterverse has been even more amazing than I imagined.  From sitting next to an Iron Viz contestant and forming a friendship on the plane, to getting a ride to my hotel, to meeting up with friends in a crazy food truck alley, to someone shouting my name in the Expo Hall – it’s been a wave of positive energy.

One unexpected component was the local Phoenix community.  It’s been awesome to see familiar faces from Phoenix wandering around Austin soaking in every moment.  I wanted to come to Austin and feel surrounded by the familiar, and that is definitely something that’s been accomplished.

The venues – When I was 18, I redecorated my childhood bedroom to be more “adult.”  Part of the process was finding the perfect desk for my space.  I somehow stumbled onto an Ikea webpage (mind you, I grew up in a small-ish city in Indiana).  Not knowing too much, I convinced my mom to road trip to Chicago to go to Ikea and buy my perfect desk.  What I expected at the time was to walk into a normal size furniture store.  I couldn’t fathom or anticipate the sheer size the store turned out to be.  That’s been my experience in Austin so far.  Overwhelmingly massive in size with everything being on a grand unexpected scale.  Not bad, just unexpected.  The registration desk had 50+ smiling faces greeting me.

Logistics – I’m still early in the game, so I will have to elaborate after a full day of conference.  So far I’ve been extremely impressed.  I was intimidated by being south of campus.  How would I get around, would I be able to be “in it?”  This has been a non-issue.  Details on transportation have been very transparent and well organized.  There’s been food at every turn, plenty to sustain even the weirdest of diets.

The weather – This has been my only letdown!  I can tell it has been rainy off and on, so it is super humid.  For someone used to the dry Arizona air, it’s a little different feeling the moisture in the air.  I’m sure my skin is thankful!  But tonight I’m left running the A/C to compensate for the moisture.  A huge change from swimming in Phoenix on Sunday to heavy humidity on Monday.

First up for my very full Tuesday is hopefully a Healthcare meetup, followed by the morning keynote (I really need to eat some breakfast!).  After that – we’ll see.  I originally anticipated spending the majority of my time in Jedi hands-on sessions.  I love seeing how people solve data problems and figuring out things I can take back, tweak, and tinker with.  After today, I’m wondering if I should reevaluate.  The one thing I won’t be able to recreate after this experience is the people, so anytime there’s a schedule clash, I’m prioritizing networking above all else.

#data16 day one in the books!