A follow up to The Women of #IronViz

It’s now five days since the Tableau Conference (#data17), and the topic of women in data visualization – and the particularly pointed topic of women competing in Tableau’s #IronViz competition – is still fresh on everyone’s minds.

First – I think it’s important to recognize how awesome the community reception of this topic has been.  Putting together a visualization that highlights a certain subsection of our community is not without risk.  While going through the build process, I wanted to keep the visualization in the vein of highlighting the contributions of women in the community.  It wasn’t meant to be selective or exclusive; instead, it was a visual display of something I was interested in understanding more about.  Despite the five days that have passed since the conference, the conversations I’ve been involved in (and observed from a distance) have all remained inclusive and positive.  I’ve seen plenty of people searching for understanding and hunting for more data points.  I’ve also seen a lot of collaboration around solutions and collecting the data we all seek.  What I’m thankful not to have witnessed is blame or avoidance.  In my mind, this speaks volumes about the brilliant and refined members of our community and their general openness to change, feedback, and improvement.

One thing making the rounds that I felt compelled to build on is @visualibrarian’s recent blog post, which features interview-style questions and answers on the topic.  I am a big believer in self-reflection and exploration and was drawn to her call to action (maybe it was the punny and sarcastic nature of the ask) to answer the questions she put forth.

1. Tell me about yourself. What is your professional background? When did you participate in Iron Viz?

My professional background is that of a data analyst.  Although I have a bachelor’s degree in Mathematics, my first professional role was as a Pharmacy Technician entering prescriptions.  That quickly morphed into a role dedicated to reducing prescription entry errors, which built on itself over and over, leading to roles like quality improvement and process engineering.  I’ve always been very reliant on data and data communication (in my early days, via PowerPoint) to help change people and processes.  About 2 or 3 years ago I got fed up with being the business user at the mercy of traditional data management or data owners and decided to brute force my way into the “IT” side of things.  I was drawn to doing more with data and having better access to it.  Fast-forward to the role I’ve had for a little over 8 months: Data Visualization Consultant.  Essentially that means I spend a significant amount of my time partnering with organizations to enable them to use visual analytics, improve the platforms they are currently using, or overcome any developmental obstacles they may have.  It also means I spend a significant amount of time championing the power of data visualization and sharing “best practices” on the topic.  I often call myself a “data champion” because I seek simply to be the voice of the data sets I’m working with.  I’m there to help people understand what they’re seeing.

In terms of Iron Viz – I first participated in 2016’s 3rd round feeder, Mobile Iron Viz, and I’ve participated in every feeder round since.  That’s the general plan on my end: continue to participate until I make it on stage or they tell me to stop 🙂

2. Is Tableau a part of your job/professional identity?

Yes – see answer to question #1.  It’s pretty much my main jam right now.  But I want to be very clear on this point – I consider my trade visual analytics, data visualization, and data analytics.  Tableau is to me the BEST tool to use within my trade.  By no means the only tool I use, but the most important one for my role.

3. How did you find out about Iron Viz?

When I first started getting more deeply involved in my local User Group, I found out about the competition.  Over time I became the leader of my user group and a natural advocate for the competition.  Once I became a part of the social community (via Twitter) it was easy to keep up with the ins and outs of the competition.

4. Did you have any reservations about participating in Iron Viz?

Absolutely – I still have reservations.  The first one I participated in was sort of on a whim, because I found something that I wanted to re-visualize in a very pared-down, elegant, simple way.  I ended up putting together the visualization in a very short period of time, and after comparing it to the other entries I felt mine was very out of place.  I tend to shy away from putting text-heavy explanations within my visualizations, so I’ve felt very self-conscious that my designs don’t score well on “storytelling.”  It was also very hard in 2016 and the beginning of 2017, when votes were based on Twitter.  You could literally search for your hashtag and see how many people liked your viz.  It’s a very humbling and crushing experience when you don’t see any tweets in your favor.

5. Talk me through your favorite submission to Iron Viz. What did you like about it? Why?

Ah – they are all my favorite for different reasons.  For each entry I’ve always remained committed and deeply involved in what the data represents.  Independent of social response, I have always been very proud of everything I’ve developed – for no other reason than the challenge of understanding a data set further and bringing a new way to visually display it.  My mobile entry was devastatingly simple – I love it to death because the mobile version is so pared down.  For geospatial I made custom shapes for each of the different diamond grades.  It’s something I don’t think anyone in the world knows I did – and for me it really brought home the lack of interest I have in diamonds as rare, coveted items.

6. What else do you remember about participating in Iron Viz?

The general anxiety around it.  For geospatial 2017 I procrastinated around the topic so much.  My parents actually came to visit me, and I took time away from being with them to complete my entry.  I remember my mom consoling me because I was so adamant that I needed to participate.

Safari and Silver Screen were different experiences for me.  I immediately locked in on data sets on subjects I loved, so there was less stress.  When I did the Star Trek entry I focused on the look and feel of the design and was so stoked that the data set even existed.  Right now I am watching The Next Generation nightly, and I go back to that visualization to see how it compares to my actual perception of each episode (in terms of speaking pace and flow).

7. Which Iron Viz competitions did you participate in, and why?

Everything since 2016 feeder round 3.  I felt a personal obligation and an obligation to my community to participate.  It was also a great way for me to practice a lot of what I tell others – face your fears and greet them as an awesome challenge.  Remain enthusiastic and excited about the unknown.  It’s not always easy to practice, but it makes the results so worth it.

8. What competitions did you not participate in, and why?

Anything before mobile – and only because I (most likely) didn’t know about it.  Or maybe more appropriately stated – I wasn’t connected enough to the community to know of its existence or how to participate.

9. Do you participate in any other (non Iron Viz) Tableau community events?

Yes – I participate in #MakeoverMonday and #WorkoutWednesday.  My goal for the end of 2017 is to have all 52 weeks of each completed.  Admittedly I am a bit off track right now, but I plan on closing that gap soon.  I also participate in #VizForSocialGood and have participated in past User Group viz contests.  I like to collect things and am a completionist – so these are initiatives that I’ve easily gotten hooked on.  I’ve also reaped so many benefits from participation.  Not just the growth that’s occurred, but the opportunity to connect with like-minded individuals across the globe.  It’s given me the opportunity to have peers who can challenge me and to be surrounded by folks I aspire to be more like.  It keeps me excited about getting better and knowing more about our field.  It’s a much richer and deeper environment than I have ever found working within a single organization.

10. Do you have any suggestions for improving representation in Iron Viz?

  • Make it more representative of the actual stage contest
  • Single data set
  • Everyone submits on the same day
  • People don’t tweet or reveal submissions until contest closes
  • Judges provide scoring results to individual participants
  • The opportunity to present analysis/results, the “why”
  • Blind submissions – don’t reveal participants until results are posted
  • Incentives for participation!  It would be nice to have swag or badges or a gallery of all the submissions afterward

And in case you just came here to see the visualization that’s set as the featured image, here’s the link.

#data17 Recap – A quick top 5

Now that Tableau Conference 2017 has come to a close, it’s time to reflect on my favorite and most memorable moments.  I’ll preface by saying that I had very lofty goals for this conference.  It started after #data16 – immediately after the conference I did some significant thought work on what I wanted my year to look like and HOW I wanted to shape it.  It began by deeply examining my why.  My why is a personal mission that transcends my professional career.  I firmly believe that visualizing data is the BEST way to get closer to the truth and to grow, learn, and improve.  I also strongly feel that every individual should use analytics in making decisions.

Without further ado – here’s my top 5.

#1 – Participating in #MakeoverMonday live

This is my number one because it represents the culmination of a lot of my personal growth and effort this year.  Coming out of #data16 I committed myself to doing every #MakeoverMonday and most specifically to participating in this event.  I’ll admit I have a few weeks still outstanding, but I’m on track to have a full portfolio by the end of 2017.  This was also the moment I was most anxious about.  Would I be able to develop something in an hour AND feel comfortable enough to share it with a large audience?  Well, at the end of the hour I accomplished everything I had been anticipating and more.  With the support of the amazing #MakeoverMonday family and those around me, I got up and presented my viz.  And to boot, it became a Viz of the Day (VOTD) the following day.  Talk about a crowning achievement – taking something I had a bit of nerves about and turning it into a super proud moment.

#2 – Community Battle: 60 Second Viz Tips

I think the picture says it all.  A rapid-fire tips battle at 9 am the day after Data Night Out?  YES please.  This session was an unbelievable joy to participate in.  Birthed out of the minds of the amazing London Tableau User Group and brought to the sessions of conference.  As if by fate, the first community folks who found me when I dropped into Vegas were none other than Paul Chapman and Lorna Eden.  I couldn’t be more grateful for the opportunity to contribute to the conference.  And let’s not forget the trophy, guys.  This is a new tradition I’d love to see carry on into the conferences to come.

#3 – The Vizzies!

A pure demonstration of the awesome Tableau community that exists globally.  I was so honored to be recognized this year as a Community Leader.  Just take a look at the amazing folks that I am so thankful to be surrounded by.  More than being recognized, the fact that the community was so prominent is unbelievable.  I couldn’t go anywhere without being stopped and having conversations, hugs, or smiles shared between community members.

#4 – Iron Viz!

As I always say – the pinnacle when it comes to Tableau.  The chance to see what 20 minutes of extreme vizzing looks like from the front row.  This one is near and dear to my heart because I submitted for each of the feeder contests this year.  I’d love the opportunity to get up on the big stage and participate, but barring that – it’s an enthusiast’s dream to see the excitement play out on the big stage.

#5 – Fanalytics

Although it is coming in at my #5, this is probably the highest-impact moment of conference.  A 3+ hour conversation started by important members of the Tableau community talking about their journeys and growth.  It ended with me facilitating one of 8 conversations about important topics the community is facing.  Mine was focused on female participation in #IronViz.  What was interesting about this was the overwhelming feeling that we needed more data to wrap our arms around the topic.  And this has become my first action item post-conference: I wanted to extend the conversation beyond the round table of remaining conference goers and into the broader community.

I’ve been so inspired, impressed, and energized by all the community and people I encountered over the past week.  I can’t wait to see what the next 12 months look like.

And now that I’ve provided my top 5, I’m curious – what are your top #data17 moments?

Don’t be a Bridge, Instead be a Lock

Lately I’ve spent a lot of time pondering my role in the world of data.  There’s this common phrase that we as data visualization and data analytics (BI) professionals hear all the time (and that I am guilty of saying):

“I serve as the bridge between business and IT.”

Well – I’m here to say it’s time to move on.  Why?  Because the bridge analogy is incomplete.  And because it doesn’t accurately represent the way in which we function in this critical role.  At first glance the bridge analogy seems reasonable.  A connector, something that joins two disparate things.  In a very physical way it connects two things that otherwise have an impasse between them.  The business is an island.  IT is an island.  Only a bridge can connect them.  But is this really true?

Instead of considering the two as separate entities that must be connected, what if we rethought them as bodies of water at different levels?  They touch each other, they are one.  They are the same type of thing.  The only difference is that they are at different levels, so something like a boat can’t easily go between them.  Is this not what is really happening?  “The business” and “IT” are really one large organization – not two separate, foreign entities.

This is where the role of being the Lock comes in.  A lock is the mechanism by which watercraft are raised or lowered between waterways.  And to a large extent it is a better analogy for our roles in data.  We must adapt to the different levels of business and IT.  And more importantly, it is our responsibility to perform that function – to get the boat (more specifically, “the data”) through from one canal to the other.

Even exploring what Wikipedia says about a lock – it fits better.

“Locks are used to make a river more easily navigable, or to allow a canal to cross land that is not level.”

“Larger locks allow for a more direct route to be taken” [paraphrased]

Is this not how we function in our daily roles?  How fitting is it to say this:

“My role is to make your data more easily navigable.  My goal is to allow data to flow through on your level.  I’m here to allow a more direct route between you and your data.”

It feels right.  I’m there to help you navigate your data through both IT and business waters.  And it is my privilege and honor to facilitate this.  Let’s drop the bridge analogy and move toward a new paradigm – the world where we are locks, adjusting our levels to fit the needs of both sides.

Boost Your Professional Skills via Games

Have you ever found yourself in a situation where you were looking for opportunities to get more strategic, focus on communication skills, improve your ability to collaborate, or just stretch your capacity to think critically?  Well I have the answer for you: pick up gaming.

Let’s pause for a second and provide some background: I was born the same year the NES was released in North America – so my entire childhood was littered with video games.  I speak quite often about how much video gaming has influenced my life.  I find games to be one of the best ways to unleash creativity, inhabit a universe where failure is safe, and find constant opportunities for growth and challenge.

With all that context you may think this post is about video games and how they can assist with growing out the aforementioned skills.  And that’s where I’ll add a little bit of intrigue: this post is actually dedicated to tabletop games.

For the past two years I’ve picked up an awesome hobby – tabletop gaming.  Not your traditional Monopoly or Game of Life – but games centered around strategy and cooperation.  I’ve taken to playing them with friends, family, and colleagues as a way to connect and learn.  And along the way I’ve come across a few of my favorites that serve as great growth tools.

Do I have you intrigued?  Hopefully!  Now on to a list of recommendations.  And the best part?  All but one of these can be played with 2 players.  I often play games with my husband as a way to switch off my brain from the hum of everyday life and into the deep and rich problems and mechanics that arise during game play.

First up – Jaipur

Jaipur is a 2-player-only game that centers around trading and selling goods.  The main mechanics here are knowing when to sell, when to hold, and how to manipulate the market.  There are also camel cards that, when taken from the market, cause new goods to appear.

Why you should play: It is a great way to understand value at a particular moment in time.  From being first to market, to waiting until you have several of a specific good to sell, to driving changes in the market by forcing your opponent’s hand.  It helps unlock the necessity to anticipate next steps.  It shows how you can have control over certain aspects (say all the camels to prevent variety in the market), but how that may put you at a disadvantage when trying to sell goods.

It’s a great game that plays out in a max of 3 rounds and probably 30 minutes.  The variety and novelty of what happens makes this a fun game to repeat.

Hanabi

Hanabi is a collaborative game that plays anywhere from 2 to 5 people.  The basic premise is that you and your friends are absentminded fireworks makers and have mixed up all the fireworks (numeric sets 1 to 5 of 5 different colors).  Similar to Indian Poker you have a number of cards (3 or 4) facing away from you.  That is to say – you don’t know your hand, but your friends do.  Through a series of sharing information and discarding/drawing cards everyone is trying to put down cards in order from 1 to 5 for particular colors.  If you play a card too soon then the fireworks could go off early and there’s only so much information to share before the start of the fireworks show.

This is a great game to learn about collaboration and communication.  When you’re sharing information, you give either color or numeric information to someone about their hand.  This can be interpreted several different ways, and it’s up to the entire team to communicate effectively and adjust to each other’s interpretation styles.  It also forces you to make choices.  My husband and I recently played and got dealt a bunch of single high-value cards that couldn’t be played until the end.  We had to concede as a team that those targets weren’t realistic to go after – letting them go was the only way we could end up with a decent fireworks display.

Lost Cities

This is another exclusively two-player game.  It’s also a set-building game where you’re going on exploration missions to different natural wonders.  Your goal is to fill out sets in numeric order (1 to 10) by color.  There’s a baseline cost to each mission, so you’ll have to be wise about setting off on one.  There are also cards you can play (before the numbers) that let you double, triple, or quadruple your wager on the exploration’s success.  You and your opponent take turns drawing from a pool of known cards or from a deck.  Several tactics can unfold here.  You can build into a color early, or completely change paths once you see what the other person is discarding.  It’s also a juggling act to decide how much to wager and still end up making money.

Bohnanza

This one plays well with a wide range of player counts.  The key mechanic here is that you’re a bean farmer with 2 fields to plant beans.  The order in which you receive cards is crucial and can’t be changed.  It’s up to you to work together with your fellow farmers at the bean market to not uproot your fields too early and ruin a good harvest.  This is a rapid-fire trading game where getting on someone’s good side is critical, and you’ll immediately see the downfall of holding on to cards for the “perfect deal.”  But of course you have to balance out your friendliness with the knowledge that if you share too many high-value beans, the other farmers may win.  There’s always action on the table, and you have to voice your offer quickly to remain part of the conversation.

The Grizzled

The Grizzled is a somewhat melancholy game centered around World War I.  You’re on a squad trying to successfully fulfill missions before all morale is lost.  You’ll do this by dodging threats and offering support to your team.  You’ll even make speeches to encourage your comrades.  This game offers lots of opportunities to understand when and how to be a team player to keep morale high and everyone successful.  The theme is a bit morose, but it adds context to the intention behind each player’s actions.

The Resistance

Sadly this requires a minimum of 5 people to play, but it is totally worth it.  As the box mentions, it is a game of deduction and deception.  You’ll be dealt a secret role and are either fighting for victory or sabotage.  I played this one with 8 other colleagues recently and pure awesomeness was the result.  You’ll get the chance to pick teams for missions, vote on how much you trust each other, and ultimately fight for success or defeat.  You will get insight into crowd politics and how individuals handle situations of mistrust and lack of information.  My recent 9-player game devolved into using a whiteboard to help with deductions!

Next time you’re in need of beefing up your soft skills or detaching from work and want to do it in a productive and fun manner – consider tabletop gaming.  Whether you’re looking for team building exercises or safe environments to test how people work together – tabletop games offer it all.  And in particular – collaborative tabletop games.  With most games there’s always an element of putting yourself first, but you will really start to understand how individuals like to contribute to team mechanics.

The Importance of Certification

I’ve long been a huge advocate of certification in technical fields.  I think it is a great way to actively communicate and demonstrate the skill level you have in a particular area.  Even further in my mind it represents the ability to set a foundation to build off of with others.

I personally started my Tableau certification journey last year (more like 18 to 20 months ago).  I was becoming much more heavily involved in my local Tableau user group and felt that I needed a way to benchmark or assess my skills.  I knew I had much more to contribute to my local community and I thought that going through the certification journey and sharing that with the TUG would be beneficial.

So I started by getting ready for the Desktop Qualified Associate exam.  Independently I went through all the existing documentation and searched for knowledge nuggets that would set me on the right path.  I took the approach of developing everything I learned into a format that could be digested by a larger audience.  I walked through all of the practice exam questions and presented an analysis of the different certification levels to the user group at least 3 times.

I passed the Desktop Qualified Associate exam around April or May of 2016.  It was a great achievement – and I was able to add definition to what that exam means.  Having the Desktop Qualified Associate certification means that you are technically proficient in the features and functions of Tableau Desktop.  It means that you can answer thoughtful questions using built-in features and that you have a depth of understanding of best practices and efficient ways to get to results.  If I were to equate it to a different skill, I would say that it means you know how and when to use different tools in a toolbox.  What it doesn’t mean: that you are a masterful architect or that you can build a stunningly beautiful home.

Getting to the next level of mastery and understanding means you’ll need the Certified Professional.  If you take a look at the specific components that are tested, you’ll quickly realize that advanced technical skill is weighted less than storytelling or analysis.  The purpose of the Desktop Certified Professional is to demonstrate that you have a deep understanding of data visualization, of using data visualization to tell a story, and of how level-two (or three or four) analytics are necessary to start answering deeper and more important (read as: higher-impact) questions.

For my preparation, the exam prep guide was only the beginning.  It assists with the structural components: 1) knowing how many questions there will be, 2) estimating the time available to spend on each question, 3) seeing examples of the analytical/presentation depth required to demonstrate proficiency, and 4) understanding the variety of question types.

Probably the most intriguing questions for me are those where you have to assess a particular visualization, give and justify a critique (and specifically how it relates to a described objective) and then provide an alternative solution (also justifying verbally the importance).  This skill is much different than knowing how to hammer a nail into a post.  It is defending why you chose to put a porch on the northeast corner of a home.  It’s a lot about feel.

I had such an awesome time taking the exam.  There were a lot of real-world constraints that required me to distill down the most important components of each question.  It’s interesting because for most items there isn’t a single right answer.  There are definitely lots of wrong answers, but right is a spectrum that is somewhat dependent on your ability to communicate the completeness of your point of view.

I’ve had the title of Tableau Desktop Certified Professional for just over a year now – so I can tell you with a decent amount of retrospective thought what it has done for me.  Just as I am able to describe the test and the purpose it served in this blog post, I can do the same thing in all of my interactions.  It keeps me humble, knowing that the PURPOSE behind a visual display is more important than fancy widgets or cool tricks.  That to a large extent my role is to work through the semantics of a situation and get to the root of it.  The root of the question or questions, the heart of the concern, the why behind the visualization.  And also the artistry (yes, I use this word) behind what it takes to get there.  We have all felt the difference between a perfectly acceptable visualization and the right visualization.  The end user experiences something different.  I firmly believe that deeper understanding can be achieved by spending that extra thoughtfulness on iteration.

So let’s now fast-forward to the other certification path – the more recent one: Tableau Server.  What’s interesting is that because my strengths have been built out on the visualization end, I hadn’t planted myself in an opportunity to understand the deeper technical components of Tableau Server.  I have always understood and had great depth of knowledge in Site Administration – that is to say, acknowledging and abiding by best practices for sharing, permissions, and managing content.  But the part that I had not spent time on was creating a sustainable platform to have the vision continuously executed.

So to overcome that minor blind spot, I went on a mission to learn more, to shine light on the unknown.  You’ve seen that play out here on my blog – going on a self-directed adventure to deploy a Server on Azure.  Nobody told me to do that – I was internally compelled.  (I should also mention I was honored to have a friend go on the journey with me!)

I’m probably rehashing at this point – but anytime you grow knowledge in a particular area (more specifically technical) it gives you such breadth and depth of vocabulary to be able to connect to other individuals.  You find that communication barriers that were preventing the success of a project are diminished because you now speak the same language.  As I write this I can hear Seth Godin saying that the more knowledge someone has in a particular subject area the more ABSTRACT their language is around it.  Which means that it is extremely difficult to communicate with that SME unless significant effort is taken on the part of both parties to bridge the gap.

So that’s what the Tableau Server qualification has done for me.  It’s the first step on a journey, and I imagine the next level, Server Certified Professional, is about the act of execution.  It’s less knowledge and verbiage and more tactical.  There’s also likely more ambiguity – not a right answer, rather a spectrum of right where you communicate your why.

As I wind down this post – I must shout to you “go get certified.”  Ignore the naysayers.  It’s easy to not do something, but you know what is hard?  Doing something.  Being tested.  And why is that?  Because you can fail.  Get over failure – push through that mindset.  The alternative is much more pleasant and unlocks all the potential the universe has to offer.

 

Star Trek The Next Generation: Every Episode (#IronViz 3)

It’s that time again – Iron Viz feeder contest!  The third and final round for a chance to battle at conference in a chef coat is upon us.  This round the focus was on anything ‘Silver Screen.’

With a limitless topic, I was certain that I would find myself in a creative rut and would likely end up submitting something right at the end of the submission period (August 13th).  So I am as shocked as anyone else that I have a fully formed submission well before the deadline.

So what’s the topic and what got me unstuck?  Star Trek of course!  The backstory here is amazing – I went to a belated wedding shower for a few friends and they mentioned to me that they were going to the annual Star Trek convention.  And more specifically there was a special celebration occurring – the 30th anniversary of Star Trek: The Next Generation.  Not even up for debate – it just IS the best incarnation of the Star Trek universe.

So I decided to take a moment to do some research on finding TNG data.  It didn’t take me long to unearth this fantastic data set on GitHub that includes each episode’s script parsed out by character.

Really inspired by the thought of seeing each word of each episode visualized, I set forth on my mission.  As I got started there was one component that was mission critical: the bold, moody colors present throughout the world of Star Trek.  They’re fantastic – especially paired with a black background.  And working with individual scripts meant that I could use color to accentuate different characters – much like their uniforms do in the episodes.

The next component that I wanted to evoke (again – design focused here) was the electronics and computer interfaces.  I particularly like the rounded edges and strong geometric shapes on the computer screens across all iterations of Star Trek.  So that describes most of the design – the choice of colors and how some of the visualizations were set up.

Now on to the next important component here: analysis.  When you see this visualization you may find yourself realizing that I don’t draw any conclusions.  For this collection of visualizations I am playing the role of curator.  I am developing a visual world for you to interact with, to go deep and wide in your understanding.  I am not attempting to summarize data for you or force conclusions upon you.  I am inviting you to come into the world of Star Trek, unearth who speaks during each episode, find out what that character is saying.  I want there to be an unending number of takeaways and perceptions generated from this.

And the last part you need to understand is the storytelling.  This entire visualization has an untold number of stories in it by virtue of being a visualization of the entire series.  If you want a meta-story to tell it’s simply this: Star Trek The Next Generation is such a deep and rich world that you should go get lost in it.  And while you’re on the path of getting lost do me a favor: retain some leadership tidbits from Picard and sprinkle in some logical takeaways from Data.

 

Let’s Break it Down: Tableau Server

Over the past month I’ve had the awesome pleasure of working on increasing my knowledge around Tableau Server.  As part of the process of learning more and advancing my knowledge I set out a goal of taking the Tableau Server Qualified Associate exam.  I’m really thrilled (and relieved!) to say that I’ve passed the test.  More than the cool credentials that come along with the certification is the vast amount of best practices knowledge and detail that I’ve gained along the way.

To aid in retention and sharing with the community – I figured it would be worthwhile to share some of the knowledge I curated along the way.  The certification itself is broken down into 5 components, so keeping with the theme – I figured I’d break apart my knowledge in the same way.

Preparation – 20%
Preparation is perhaps one of my favorite things to do in life.  I really like to dig into details and plan for what is going to happen.  It’s not uncommon for me to vocalize and visualize how I want something to work out.  It helps me anticipate obstacles and roadblocks and formulate strategies to mitigate them.  So the fact that a huge chunk of sharing Tableau with the rest of an organization by way of Server focuses on the getting-ready part is admirable.

My preparation began with Tableau Server: Everybody’s Install Guide.  Reading through it gives all skill levels an “in” on the different Server administrative components at just the right level.  It’s also what I used to guide me through a single-machine install on Azure.  It offers a glimpse into the more detailed components that go into Server administration and starts to develop the relationships and understanding between those components.

Along the way you’ll learn about some of the most important things that go into preparation, namely the licensing model, authentication, and hardware requirements.  Here are my takeaways for each:

When it comes to licensing, what type of deployment are you planning?  Do you want to start or manage a set number of users or allow for a potentially unlimited number of users?  Do you anticipate needing to use the Guest account – a way to allow users to interact with visualizations without the need to log-in to the server?

When it comes to authentication – you need to get this right in the beginning.  Changing this once your server is deployed can get messy.  Do you want to leverage existing Active Directory credentials or use local authentication?  Local authentication may give you more control over authentication, but it comes at the expense of more time spent administering.

When it comes to hardware requirements – know what you’re getting yourself into.  There’s a huge difference between minimum requirements for a pilot server and recommended requirements.  You’ll also want to know how scalability and changing to a distributed environment are impacted.

Lastly one of my absolute favorite resources during this process is this article on Primary Server Installation Defaults.  It describes the process configurations and helps to dissect the way the server functions.  Marry that up with this article on Server Processes – the knowledge article that defines each process and helps you articulate each component’s function.

Some more anticipatory work should be done on data sources and the server machine itself.  You’ll want to ensure that your Run As User account has the right level of access to data sources and can perform all the necessary functions on the box.  As someone thinking about deploying Server and giving data to the masses (in a controlled, positive way) you’ll have to think through that component.

It can also lead into future-forward questions about what type of server you’ll want to tune toward.  The three choices in terms of Caching become a good topic of conversation and help to identify the path you may want to choose for your deployment.


Once you get past the enjoyment of preparation comes the exciting moment of installation and configuration.

Install & Configuration – 30%
This part is the moment of truth.  You’re about to embark on the server journey and you truly are downloading the software and deploying it.  As you get started, hopefully some of the preparation tips you learned along the way can minimize any hiccups.

As you are greeted with the Server Configuration section you’ll want to access this article on how to configure SMTP setup.  This is the component that will allow you as an administrator to receive alert and failure emails.  It’s also the component that will allow end consumers to receive data-driven alerts and subscribe to views.  More visualization to the masses!
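As a side note, SMTP can also be set from the command line instead of the configuration utility – a sketch assuming the 10.x-era tabadmin key names, with a placeholder mail host and send account:

```shell
# Stop the server before changing configuration
tabadmin stop

# Point alerts and subscriptions at your mail server (values are placeholders)
tabadmin set svcmonitor.notification.smtp.server mail.example.com
tabadmin set svcmonitor.notification.smtp.port 25
tabadmin set svcmonitor.notification.smtp.send_account tableau-alerts@example.com

# Apply the new settings and restart
tabadmin configure
tabadmin start
```

Either way you set it, send yourself a test subscription afterward to confirm the mail actually flows.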

This is also where things start to bleed into the role of Site Administrator.  Being the server administrator means you’re the one who sets up initial permissions for subsequent users.  You’ll need to put on your site administrator hat and ensure you understand the different levels of permissions and how content is managed.

You’re also bound to get a few questions about security here – both from the perspective of authentication and SSL, and from what those on the business and data management side care about most: access to data.  Read up on ways to limit data in a view from the desktop perspective here.

I also think it is really important to know and explore what a Site is.  The importance of this concept presents itself again when you think about how you’ll successfully implement Tableau Server so that the right people have access to the most relevant content.

You can even add a little flair during this initial configuration phase.  Customize the look and feel of Tableau Server by reading the Change the Name or Logo article.  I think you’re likely to get some street cred by personalizing the experience in the beginning of deployment.

Next up is…

Administration – 30%
After going through initial deployment comes the fun of full on administration.  I like this part a lot because it folds in the additional tools outside of the configuration window like tabadmin.  Tabadmin is one of those tools that will become your BFF as you begin to do things like check the server status, manually stop and start the server, and do more maintenance oriented tasks like backups and cleanups.
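To make this concrete, here is the sort of tabadmin session I ran while practicing – a sketch based on the 10.x command set, run from the Tableau Server bin directory (the backup filename is just a placeholder):

```shell
# Check which processes are up before doing anything
tabadmin status -v

# Stop the server for maintenance tasks
tabadmin stop

# Take a backup; -d appends the current date to the filename
tabadmin backup tabserver_backup -d

# Clear out log and temp files (run while stopped to clean the most)
tabadmin cleanup

# Bring everything back up and confirm
tabadmin start
tabadmin status
```

Nothing builds comfort with the tool faster than running this loop a few times on a sandbox box where breaking something has no consequences.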

Other important components of hardcore administration at this point will be diving into the world of schedules.  At this point we should all know that Tableau Server has the capability to refresh data extracts, send email subscriptions of views, and trigger data-driven alerts.  The key here is that there is a certain amount of depth to developing, maintaining, and managing these schedules.  This knowledge stub starting with Extract Refresh Schedules breaks it all down.

And as you start thinking about potential performance concerns – consider how different scheduling scenarios and scheduling modifications could positively (or negatively) impact what your end users are experiencing.

There are also Administrative Views built into the Server to help analyze what’s going on.  I really like the “Stats for Space Usage” view because it can be a springboard to a conversation on data governance.  A tangent, but I’ve seen the value of using this view to recognize you’ve got 6 copies of the same table embedded into workbooks on the server.  Time to try out a published data source, perhaps?  You can also manage Desktop License Reporting as a server admin to answer questions like “does a specific user use their license?”

Really driving home the special powers that a Tableau server administrator has are the last two sections:

Troubleshooting – 13%
We all know that having the power of a server administrator comes with the responsibility of knowing why something is broken and aiding in resolution.  It can mean optimizing for traffic vs. optimizing for extracts.  And it can also mean super important things like knowing how to reset or add a Tableau Server administrator account.  In most of these circumstances there may be more than one way to go about the task – so take inventory of the potential approaches.

Migration & Upgrade – 7%
Having Tableau Server is really cool and special – having the latest version that doesn’t break is even better.  So is knowing how to get yourself out of the number one pickle the preparation step was trying to help you avoid: changing authentication.  I also like this part because you get to see the list of new features and fixes that come with each new version.  Who doesn’t love a good rundown of what’s new and what’s changed?

Going through this process has helped cement in my mind some of the most important best practices when it comes to Tableau Server – I hope the same holds true for you!

 

Azure + Tableau Server = Flex

I’m affectionately calling this post Azure + Tableau Server = Flex for two reasons.  First – are you a desktop user that has always wanted to extend your skills in Tableau as a platform?  Or perhaps you’re someone who is just inherently curious and gains confidence by learning and doing (I fall into this camp).  Well then this is the blog post for you.

Let me back up a bit.  I am very fortunate to spend a majority of my working time (and a fair amount of my free time!) advocating for visual analytics and developing data visualizations to support the value it brings.  That means doing things like speaking with end users, developing requirements, partnering with application and database owners/administrators, identifying and documenting important metrics, and finally (admittedly one of the more enjoyable components) partnering with the end user on the build-out and functionality of what they’re looking for.  It’s a very iterative process, with a fair amount of communication and problem solving sprinkled in with the pure development time – a lucky job.  The context here is this: as soon as you start enabling people to harness the power of data visualization and visual analytics, the immediate next conversation becomes: how can I share this with the world (or ‘my organization’)?  Aha!  We’ve just stepped into the world of Tableau Server.

Tableau Server and Tableau Online bring the capability to share the visualizations you’re making with everyone around you.  They do exactly what you want: share interactive, data-rich displays via a URL.  Just the thought of it gets me misty-eyed.  But, as with any excellent technology tool, it comes with the responsibility of implementation, maintenance, security, cost, and ultimately a lot of planning.  And this is where the desktop developer can hit a wall in taking things to that next level.  When you’re working with IT folks or someone who may have done something like this in the past, you’ll be hit with a question wall that runs the entire length of every potential ‘trap’ or ‘gotcha’ moment you’re likely to experience with a sharing platform.  And more than that – you’re tasked with knowing the answers immediately.  Just when you thought getting people to use terms like tooltip, box plot, and dot plot was exciting, they start using words like performance, permissions, and cluster.

So what do you do?  You start reading through administration guides, beefing up your knowledge on the platform, and most likely extending your initial publisher perspective of Tableau Server to the world of server administrator or site administrator.  But along the way you may get this feeling – I certainly have – I know how to talk about it, but I’ve never touched it.  This is all theoretical – I’ve built out an imaginary instance in my mind a million times, but I’ve never clicked the buttons.  It’s the difference between talking through the process of baking and decorating a wedding cake and actually doing it.  And really, if we thought about it: you’d be much more likely to trust someone who can say “yeah, I’ve baked wedding cakes and served them” as opposed to someone who says “I’ve read every article and recipe and how-to in the world on baking wedding cakes.”

Finally we’re getting to the point and this is where Azure comes into play.  Instead of stopping your imaginary implementation process because you don’t have hardware or authority or money to test out an implementation and actually UNBOX the server – instead use Azure and finish it out.  Build the box.

What is Azure?  It’s Microsoft’s extremely deep and rich platform for a wide variety of services in the cloud.  Why should you care?  It gives you the ability to deploy a Tableau Server test environment through a website, oh, and they give you money to get started.  Now I’ll say this right away: Azure isn’t the only one.  There’s also Amazon’s AWS.  I have accounts with both – I’ve used them both.  They are both rich and deep.  I don’t have a preference.  For the sake of this post – Azure was attractive because you get free credits and it’s the tool I used for my last sandbox adventure.

So it’s really easy to get started with Azure.  You can head over to their website and sign up for a trial.  At the time of writing they were offering a 30-day free trial and $200 in credits.  This combination is more than enough resources to be able to get started and building your box.  (BTW: nobody has told me to say these things or offered me money for this – I am writing about this because of my own personal interest).

Now once you get started there are sort of 2 paths you can take.  The first one would be to search the marketplace for Tableau Server.  When you do that there are literally step-by-step configuration settings to get to deployment.  You start at the beginning with basic configuration settings and then get all the way to the end.  It’s an easy path to get to the Server, but outside the scope of where I’m taking this.  Instead we’re going to take the less defined path.

Why not use the marketplace process?  Well, I think the less defined path offers the true experience of start to finish – hardware sizing through to software installation and configuration.  By building the machine from scratch (albeit a virtual machine) you mimic the entire process more closely than using a wizard.  You have fewer guard rails, more opportunity for exploration, and the responsibility of getting to the finish line correctly is completely in your hands.

So here’s how I started: I made a new resource, a Windows Server 2012 R2 Datacenter box.  To do that, you go through the marketplace again and choose that as a box type.  It’s probably a near identical process to the marketplace Tableau Server setup.  Make a box, size the box, add optional features, and go.  To bring it closer to home, go through the exercise of minimum requirements vs. recommended requirements from Tableau.  For a single-node box you’ll need to figure out the number of CPUs (cores), the amount of RAM (memory), and the disk space you’ll want.  When I did this originally I tried to start cheap.  I looked through the billing costs of the different machines on Azure and started at the minimum.  In retrospect I would say go with something heavier powered.  You’ll always have the option to resize/re-class the hardware – but starting off with a decent amount of power will prevent a slow install experience and degraded initial Server performance.
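If you would rather script the box than click through the portal, the same VM can be stood up with the cross-platform Azure CLI – a sketch assuming the `az` tool is installed; the resource names here are made up, and the size is one option that clears Tableau’s recommended specs rather than the bare minimum:

```shell
# Resource group to hold the sandbox (name and region are placeholders)
az group create --name tableau-sandbox --location westus2

# Windows Server 2012 R2 Datacenter VM; DS4_v2 is an 8 vCPU / 28 GB size
az vm create \
  --resource-group tableau-sandbox \
  --name tableau-vm \
  --image Win2012R2Datacenter \
  --size Standard_DS4_v2 \
  --admin-username azureuser \
  --admin-password '<your-strong-password>'

# Open RDP so you can connect and run the Tableau Server installer
az vm open-port --resource-group tableau-sandbox --name tableau-vm --port 3389
```

A nice bonus of the scripted route: when you are done experimenting, `az group delete --name tableau-sandbox` tears the whole thing down so the credits stop burning.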

Once you create the resource, you literally click a button to boot up the box and get started.  It took probably 15 to 20 minutes for my box to initially be built – more than I was expecting.

Everything done up to this point is to get to a place where you have your own Tableau Server that you can do whatever you want with.  You can set up the type of security, configure different components – essentially get down to the nitty gritty of what it would feel like to be a server administrator.

Your virtual machine should have access to the internet, so the next step is to go here and download the software.  Here’s a somewhat pro tip: consider downloading a previous version of the server software so that you can upgrade and test out what that feels like.  Consider the difference between major and minor releases and the nuance of what the upgrade process will be.  For this adventure I started with 10.0.11 and ended up upgrading to 10.3.1.

The process of the actual install is on the level of “stupid easy.”  But, you probably wouldn’t feel comfortable saying “stupid easy” unless you’ve actually done it.  There are a few click through windows with clear instructions, but for the most part it installs start to finish without much input from the end user.

You get to the configuration window once you’ve finished the install process.

This is literally the next step and shows the depths to which you can administer the platform from within the server (from a menu/GUI perspective).  Basic things can be tweaked and setup – the type of authentication, SMTP (email) for alerts and subscriptions, and the all important Run As User account.  Reading through the Tableau Server: Everybody’s Install Guide is the best approach to get to this point.  Especially because of something I alluded to earlier: the majority of this is really in the planning of implementation, not the unboxing or build.

Hopefully by this point the amount of confidence gained in going through this process is going to have you feeling invincible.  You can take your superhero complex to the next level by doing the following tasks:

Start and Stop the Server via Tabadmin.  This is a great exercise because you’re using the command line utility to interact with the Server.  If you’re not someone who spends a lot of time doing these kinds of tasks it can feel weird.  Going through the act of starting and stopping the server will make you feel much more confident.  My personal experience was also interesting here: I like Tabadmin better than interacting with the basic utilities.  You know exactly what’s going on with Tabadmin.  Here’s the difference between the visual status indicator and what you get from Tabadmin.

When you right-click and ask for server status, it takes some time to display the status window.  When you’re doing the same thing in Tabadmin, it’s easier to tell that the machine is ‘thinking.’

Go to the Status section and see what it looks like.  Especially if you’re a power user from the front end (publisher, maybe even site administrator) – seeing the full details of what is in Tableau Server is exciting.

There are some good details in the Settings area as well.  This is where you can add another site if you want.

Once you’ve gotten this far in the process – the future is yours.  You can start to publish workbooks and tinker with settings.  The possibilities are really limitless and you will be working toward understanding and feeling what it means to go through each step.  And of course the best part of it all: if you ruin the box, just destroy it and start over!  You’ve officially detached yourself from the chains of responsibility and are freely developing in a sandbox.  It is your chance to get comfortable and do whatever you want.

I’d even encourage you to interact with the API.  See what you can do with your site.  Even if you use some assisted API process (think Alteryx Output to Tableau Server tool) – you’ll find yourself getting much more savvy at speaking Server and that much closer to owning a deployment in a professional environment.
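If you want to go a step beyond the assisted tools, signing in to the REST API directly is just an XML POST – a sketch with curl, where the server URL, credentials, and API version number are placeholders to adjust for your own deployment:

```shell
# Build the sign-in request body (empty contentUrl targets the Default site)
cat > signin.xml <<'EOF'
<tsRequest>
  <credentials name="my-user" password="my-password">
    <site contentUrl="" />
  </credentials>
</tsRequest>
EOF

# POST it; the response includes the auth token used on subsequent calls
curl "http://my-tableau-server/api/2.6/auth/signin" -X POST -d @signin.xml
```

Once you can pull a token back, the rest of the API (querying sites, publishing workbooks) follows the same request-and-token pattern.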

27 Weeks of #WorkoutWednesday

27 weeks into 2017 means 27 weeks of #WorkoutWednesday.  So it is time to do some reminiscing on the experience and provide some commentary on the profound effect it has had on me.

At the end of 2016 something was abundantly clear to me: I wasn’t as fluid as I could be and I didn’t fully understand the limitless possibilities that Tableau as a tool had.  I have this very abstract concept of how working with Tableau should be for me as an individual: the canvas and tools for my artwork.  And to be honest, that sounds kind of silly coming from someone like me – but I believe it.  When I talk to people I tell them that I’m a data communicator.  I like to see what is in data and share it with the world.  More specifically: I like to share an organized view of several data points so that the end consumer can go exploring and see the beauty of the individual points.

Getting back to the point: this means that I should be capable of wielding the full depth of Tableau.  I wanted to have the ability to orchestrate anything.  I felt it was necessary.  I wanted the feeling of flow to extend.  I didn’t want my creativity to be limited by my lack of practice or time on the clock.

So I set out a few goals for myself for 2017 – 2 that I’ll share, and 1 that this post is about.  The 2 related to my professional development are pretty similar: participate in every #MakeoverMonday and participate in every #WorkoutWednesday.  Show up, do the work, share the results with the global community, and see what happens.

So here we are: 27 weeks into the year.  What has participation done for me?  It’s not enough to say that my skills have grown exponentially.  My confidence and ability to connect with individuals have also grown tremendously.  One thing I did this year in addition to this participation was to facilitate going through the results every month at the Phoenix Tableau User Group.  A critical component from my perspective: communicating the “why” behind the build along with the “how.”  This had two main goals: forcing myself to do the work consistently (selfish), and helping the Phoenix Tableau community benefit and grow from the knowledge share.

Now that the foundation (context) has been set – I’d like to go through each individual workout and share its impact on me.

Week 1: Comparing Year over Year Purchasing Frequencies
I remember this one vividly because it’s the first.  There were two things in this particular Workout that I’d never done.  The first was to use the day of the year in a visualization.  The next was to have dynamic marker points based on a parameter.  One thing that was interesting about this was that I had a sense of how to do the running total calculation because of a Tableau blog post on the “top table calculations.”  Going through this workout was humbling.  It took me a significant amount of time, more time than I thought it should.  It was also the beginning of what is now a regular ritual.  I know that I spent a lot of time verbalizing my problem solving process and trying to get to a solution.  And I also remember the sweet satisfaction of solving and posting it.  I was hooked after the first one.

Week 2: Showing Nothing When ‘All’ is Selected
I was really thankful for this week.  There were several things that I already knew how to do, mostly with the custom shapes and how to not show something when ‘All’ was selected.  What I didn’t know how to do well was deal with dashboard titling when ‘All’ was selected.  My past attempts usually landed me in the world of ATTR aka * land.  So going through this challenge really helped me stop and process something that I previously stumbled over.  I got an amount of confidence out of this week because it took less time than the first.

Week 3: The State of U.S. Jobs
Ah, I loved this one.  Small multiples are fascinating to me.  And Andy’s blog post gave me the freedom to end up with lots of sheets – he mentioned that it wasn’t a trellis chart and I was immediately relieved.  There was a lot of formatting in this one – some really interesting tricks on how to display things that I learned.  And one that I continue to take with me is this: change row or column borders to thick white to add some padding.  I know when I downloaded Andy’s solution he had 50 sheets; I had 10.  This workout ignited something in me and I made a similar visualization regarding high school graduation rates in Arizona.

Week 4: Common Baseline Charts & NFL QBs
I really liked this Workout from a visual perspective.  I like showing massive amounts of data and then giving someone control over what is the most prominent.  This was also the second visualization that showed me how you can use running totals and baselines to show differences between categories.  This type of visualization is now something I often develop at work.

Week 5: The Distribution and Mean of NFL Quarterbacks
The math nerd inside of me loved this one.  I used to be a huge geek for box plots and I always think showing distributions of things in a visual format is very easy to interpret.  I get this mental image of looking down on a histogram, and the fact that this one had the median as opposed to the mean got me really jazzed.  I also remember feeling super cool because I successfully flipped the axis labels for the year to the top using a random tip à la Jonathan Drummey.  I also like this one because I had to download fonts from Google Fonts – a resource I didn’t even know was out there.

Week 6: UK Population Predictions – Trellis Butterfly Chart
The appearance of the word trellis had me cringing.  Looking at the visualization had me intrigued.  There was a LOT of depth in this one.  Knowing there was a comparison to a national average, that there were multiple dual-axis charts, PLUS baking in the trellis component had me concerned.  You know what ended up being the worst part for me on this one?  The year labels and the tooltips.  Each LOD in that tooltip was a validation point I had to go through to determine if my calculation was accurate.  This workout made me appreciate reversing axes.

Week 7: Dynamic Trellis Chart
It finally happened.  I couldn’t fake a trellis chart anymore and hard code different row & column locations – I had to use the capabilities of Tableau to achieve it.  More than that, there was some very sophisticated labeling that I just couldn’t get right for the life of me.  This is the only one that I gave up on.  I couldn’t figure it out and I was a little too prideful to download Andy’s workbook and USE his calculations.  I definitely downloaded and digested the process, but I didn’t feel it was authentic for me to finish the exercise – I was beaten this week.

Week 8: Marimekko Makeover
I thought this one was going to be a cakewalk because I had briefly thumbed through an article about Tableau 10 and the ability to make these types of charts.  I was wrong.  The way the data was structured made it more complex.  I shared this one at the Phoenix Tableau User Group and the whole time I was concerned that the “table calculation magic” may not be repeatable.  We made that Marimekko chart.

Week 9: World Series Game 7: Pitch-By-Pitch
I love this visualization.  I love how granular it is.  I love how abstract it is.  I love that there is color and shape encoding and even negative and positive positioning.  I also really like using characters within text to denote what is being seen in a visualization – all clever things that I do now.  As I look back I remember the one sore spot for me that I decided not to correct for: the open “reached base” shape.  I didn’t put white in the middle.  Looking back I should have – I was being lazy.  I knew how to do it and that it was the right thing to do to get it “to spec.”  But the lazier side of me won out and let it go.

Week 10: Exploring UK House Prices
This one I knew I would need help on.  I’d never made a hexbin map and I didn’t know where to start.  What’s surprising is that it’s not overly complicated.  I didn’t realize that there were built in hexbin functions.  I thought there was some deep crazy skill going on anytime I saw these.  Walking through this exercise made me change my tune.  This was also an important growth week for me.  I started getting more comfortable with the idea that it wasn’t “cheating” to use community made resources as help and guidance.  Instead I was using them for their rightful purpose.

Week 11: Full Year Calendar with Month Labels
This one has another interesting story.  I completed it last weekend (the 26th week) as opposed to the 11th week.  So how did that happen?  Well, I remember starting it and getting stuck.  I couldn’t figure out 2 things to begin with: how to get the dates in order (which sounds really lame) and how to deal with the month labels.  This was also right around the time I changed jobs and was trying to finish my MBA.  I think the challenge this one presented exhausted me from a mental perspective.  Week 11 was the start of my workout break (check out my tracker to see the details).  Once I completed it though, I was very pleased with the results.  I made a conscious decision to go a different path with the month labels and embed them into each month’s calendar.  I really like that I’m now comfortable going off spec without feeling like I’m not living up to the challenge.

Week 12: Highlight a Treemap
I’ll admit it, this one was simple for me to do.  When I came back after my mental break and did this one, I laughed at why I hadn’t done it sooner.  I appreciate the simplicity of this one in development and the impact it has on making the end-user’s experience so much more pleasant.

Week 13: Benford’s Law
Another straightforward week for me from the development perspective.  When I completed this I started to realize that I know a lot.  I know how Andy typically develops and what his tricks are.  I know how to take something displayed and translate it into Tableau.  This is a workout I completed on 6/3/17.  Six months after embarking on the #MakeoverMonday and #WorkoutWednesday challenge.  The immersion was paying off in spades.

Week 14: UK Exports Pareto
I didn’t complete this one on time, but relatively close to its original time period.  I ended up sharing this one at the Phoenix Tableau User Group.  At the first job I ever had “analyzing data,” I was asked during the interview to build a Pareto chart in Excel.  I memorized how to do it because I couldn’t describe the technical mechanics.  That was more than 3 years ago and feels like an eternity.  Today Pareto charts are still some of the most engaging and useful visuals I use when trying to assess a problem.

Week 15: How many times has a team been top of the Premier League?
Okay – this was just a community challenge with no hidden agenda.  One designed from my perspective to test and share the difference between table calculations and level of detail expressions.  I remember completing this and realizing that life before LODs must have been terrible.  And that there are some extremely talented problem solvers and thinkers out there who can develop solutions using the tools they have.

Week 16: Should I Buy Tableau Shares?
I remember this one vividly because it mirrored something I was doing at work.  It was a different take on a visualization I was trying to get people to accept.  I appreciated seeing window calculations for statistical values being present and giving users input flexibility.

Week 17: Product Spread Treemap (Part 1)
Intentionally named Part 1 – this one made me recognize the funny mechanics Tableau has.  They’re really obvious when you make a treemap: just test it out and you’ll see that the order of pills on the Marks card determines how the visual is generated.  It also taught me an important lesson: sometimes I overcomplicate things.  Before the build I had imagined the colored text as separate worksheets; going through the build, I was humbled to realize it could all be one.

Week 18: Appearing and Disappearing Vizzes
This one also made an appearance at the Phoenix Tableau User Group.  And to be perfectly honest, Emma’s topics are usually much more practical for the group.  I took this one as an opportunity to explore the differences between tiled and floating layouts.  When I demoed this to the TUG everyone loved it, and I know several users who took it back to their professional lives.  Thank you Emma.

Week 19: Product Spread Treemap (Part 2)
The agony of this one.  Andy mentioned it was going to be tough, and it was.  I had a sense that there was trickery involved because of the automagic nature of treemaps seen in Week 17.  The spoiler on this one: the boxes are different sizes.  This one also made an appearance within our user group at our Saturday Viz Club, where four members got together, collaborated on building it out, and then downloaded Andy’s solution.

Week 20: Comparing Regions
Perhaps a more appropriate name would be “building bar + line charts all in one view with the bars next to each other.”  This one was humbling for me.  It took me a long time to parse out the ‘aha’ factor and put it into action.

Week 21: NCAA Final Score-by-Score
This was another great challenge: do everything in a single worksheet.  The biggest challenge here was the data structure.  I think if I had taken the time to restructure the data set it would have been easier to develop – but being who I am, I took it on as part of the challenge.  I realized when I finished this one that I did it a different way than Andy, because I had dots everywhere and no attribute asterisks (*).  I kind of feel like that makes mine more complete.

Week 22: Wine Tasting is Harder Than it Looks
Guess what – this was also presented at our user group!  What’s great about this one is the challenges the community faced as we built it together.  When asked before the build, most attendees had never thought to make a visualization of this type.  During the build the color legends were a huge curveball – even the most tenured individuals didn’t think to make the color legend an actual sheet.  I also had a colleague tell me he didn’t realize you could drag headers in the view to change their order – he thought that was life changing.

Week 23: National Parks Have Never Been More Popular
Simply a stunning visualization to recreate: a bump chart, vivid use of color, and text color matching line color.  I love this visualization.  I shared it on LinkedIn and got reactions from so many people.  I know it is something that has been imprinted on many, many people.

Week 24: Visualising the National Student Survey with Spine Charts
I wrote a blog post on this one, so there’s a lot there, and it is still pretty fresh in my mind.  The biggest takeaways relate to the mathematics under the hood – the way numbers can do funny things.  At the end of this exercise I opened my version, Emma’s version, and Andy’s version, and we all had different numbers for the same question response.  And each of us could defend equally well the reasoning and logic behind how our number was derived.

Week 25: The Value of Top 3 & Top 5 Contributors
This taught me so much about table calculations.  I use them in basic ways on a daily basis – this workout takes them to another level.  I had never thought to use a table calculation to limit the number of members within a dimension being displayed.  Once I did it, it made perfect sense: Tableau’s order of operations for filters immediately came to mind.  I am still in awe of the depth and thoughtfulness here.
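The trick, as I understand it, hinges on that order of operations: a filter built on a table calculation (such as a rank) applies late, after other table calculations have been computed, so percent-of-total style numbers stay based on the full table rather than only the displayed members.  A rough Python sketch of that behavior, with invented contributor data:

```python
# Hypothetical contributor values for one dimension.
contributors = {"A": 50, "B": 30, "C": 15, "D": 5}

# Percent of total is computed BEFORE the top-N filter, mirroring how a
# table-calculation filter applies late in Tableau's order of operations.
total = sum(contributors.values())
pct = {k: v / total for k, v in contributors.items()}

# Rank members by value and keep only the top 3 for display --
# the percentages above remain shares of the FULL total.
ranked = sorted(contributors, key=contributors.get, reverse=True)
top3 = {k: (contributors[k], round(100 * pct[k], 1)) for k in ranked[:3]}
print(top3)  # {'A': (50, 50.0), 'B': (30, 30.0), 'C': (15, 15.0)}
```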

Week 26: UK General Election 2017 Results
Another dynamic trellis chart – no no no.  I do not like these!  I like the presentation and layout, the slope charts, the way they look like ladders.  I like the reference lines.  I don’t like dynamic trellises.  I am not convinced that the approach to dynamic trellis can be let loose in the wild – it needs some supervision.  Comparing mine to the original, I noticed how easy it was for data points to be indexed into the wrong blocks.
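For context, the usual dynamic trellis approach computes a near-square grid and derives each member’s row and column from its index – which is exactly where mis-indexing can creep in if the index isn’t computed along the right dimension.  A rough Python sketch of that placement math (my illustration, not Andy’s actual calculation):

```python
import math

def trellis_position(i: int, n: int) -> tuple:
    """Place item i (0-based) of n into a near-square grid; returns (row, col).

    Mirrors the common dynamic-trellis pattern: columns = ceil(sqrt(n)),
    then row and column fall out of integer division on the index.
    """
    cols = math.ceil(math.sqrt(n))
    return divmod(i, cols)

# Nine members land in a 3x3 grid; a wrong index shifts a point into
# the wrong block, which is the failure mode described above.
grid = [trellis_position(i, 9) for i in range(9)]
```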

Week 27: The Quadrant Chart
As if by fate, this week’s workout resonated deeply with a visualization from my history: more than a year ago I made a quadrant chart regarding wage gaps.  I really like that Andy took the time to color the tooltips for added effect.  It demonstrates something I now know to be true: duplicating and iterating off of a sheet or a calculated field is something you should be doing often.  Copy and paste is your friend.  Duplicate is music to my ears.
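The mechanics behind a quadrant chart are pleasantly simple: pick reference lines (often the average of each measure) and label every point by which side of each line it falls on.  A small Python sketch of the idea, with sample points invented for illustration:

```python
def quadrant(x: float, y: float, x_ref: float, y_ref: float) -> str:
    """Label a point by its position relative to two reference lines."""
    horiz = "right" if x >= x_ref else "left"
    vert = "upper" if y >= y_ref else "lower"
    return f"{vert} {horiz}"

# Reference lines at the mean of each measure split the view into four.
points = [(2, 8), (7, 9), (1, 1), (9, 2)]
x_ref = sum(p[0] for p in points) / len(points)  # 4.75
y_ref = sum(p[1] for p in points) / len(points)  # 5.0
labels = [quadrant(x, y, x_ref, y_ref) for x, y in points]
```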

Cheers to 27 weeks – I’m on board for the rest of the year.  As I alluded to, I made a progress tracker on my Tableau Public (and also on this site) to keep myself accountable.  While I can’t guarantee it will be done in the same week, I can say with a true heart my intention is to complete the year at 100%.

If you haven’t started the adventure of the workouts, or if you’ve done a few – I strongly encourage you to take a Saturday afternoon and go through the exercises.  Don’t look at them and lazily say “oh I could totally do that.”  DO THE WORK.  It will help you grow tremendously, unearth skill gaps, and unlock your creativity.  Thank you Andy & Emma.

#IronViz – Let’s Go on a Pokémon Safari!

It’s that time again – Iron Viz!  The second round of Iron Viz entered my world via an email with a very enticing “Iron Viz goes on Safari!” theme.  My mind immediately got stuck on one thing: Pokémon Safari Zone.

Growing up I was a huge gamer, and Pokémon was (and still is) one of my favorites.  I even have a cat named after a Pokémon – Starly (find her in the viz!).  So I knew if I was going to participate, a Pokémon Safari was the only way to go.

I spent a lot of time thinking about how I might want to bring this to life.  Did I want to do a virtual safari of all the pocket monsters?  Did I want to focus on the journey of Ash Ketchum through the Safari Zone?  Did I want to focus on the video games?

After all the thoughts swirled through my mind, I settled on the idea of doing a long-form re-creation of Ash Ketchum’s adventure through the Safari Zone in the anime.  I sat down and googled to figure out the episode number so I could go watch it.  But to my surprise, the episode has been banned.  It hasn’t aired on much TV, and the reason it is banned makes it very unattractive and unfriendly for an Iron Viz long form.  I was gutted and had to set off on a different path.

My investment in the Safari Zone episode got me looking through the general details of the Safari Zone in the games, and that ended up being my hook.  I tend to think in a very structured format, and because there were four regions that HAD Safari Zones (or what I’d consider to be the general spirit of one), it was easy for me to compare them against each other.

Beyond that, I knew I wanted to keep the styling in a spirit similar to the games.  My goal for the viz is to give the end user an understanding of the types of Pokémon in each game – to show some basic details about each pocket monster, while having users almost feel like they’re on the Safari.

There’s also this feeling I wanted to capture – anyone who has played Pokémon may know it.  It’s the shake of the tall grass.  It’s the tug of the fishing pole.  It’s the screen transition.  In a nutshell: what Pokémon did I just encounter?  There is a lot of magic in that moment between the tall-grass shake and the transition to the ‘battle’ or ‘encounter’ screen.

My hope is that I captured that well with the treemaps: you are walking through each individual area and encountering Pokémon.  The seasoned Safari-goer will be more interested in knowing WHERE to go and understanding WHAT can be found there – hence the corresponding visuals surrounding the treemaps.

The last component of this visualization was the hover interactivity.  I hope it translates well, because I wanted the interactivity to be very fluid.  It isn’t click-and-uncover – that’s too active.  I wanted this to be a very passive and openly interactive visualization where the user unearths more through exploring, without having to click.