Author: Ann Jackson

  • Installing Tableau Server on Linux – Tableau 2021.1 Edition

    Installing Tableau Server on Linux – Tableau 2021.1 Edition

    It’s been over two years since we wrote our original blog post on installing Tableau Server on a Linux machine, and to date it remains our most trafficked post. Since Tableau has continued to release new versions, we decided it was time to update the walkthrough to reflect a new deployment.

    Just like before, we’re starting with a fresh OS installation, still using Ubuntu 16.04 LTS (hey, it’s LTS for a reason!). We’ve upgraded our hardware: this time we’re installing on an actual data center server, an HP ProLiant ML350 Gen9 8-Port, with the following specs:

    • (2) 2.6 GHz 8-core Intel Xeon processors with 20 MB cache (E5-2640 v3)
    • 128 GB Memory PC4-17000R (8 x 16GB sticks)
    • 250 GB SSD

    Tableau Server 2021.1 has just been released, so we’re installing the latest and greatest version. Since we’re on a Debian-like distribution of Linux, we’ll use the .deb file type.

    We still like following along with Everybody’s Install Guide that Tableau makes available. This is great for an IT generalist or someone doing a POC installation of Tableau Server. It gives you, start to finish, the steps you need to take and links out to many important knowledge articles along the way.

    Before you get started, make sure the user you’ll be doing the installation with on the Linux machine can sudo – meaning it can perform operations like root. This will be necessary throughout the course of the installation. You’ll also want to do a general update on the OS.
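A quick way to sanity-check this before you start (a sketch; on Ubuntu, sudo rights typically come via membership in the sudo group):

```shell
# Validates that the current account can elevate (prompts for your password)
sudo -v && echo "sudo access confirmed"
# List the groups the current user belongs to - look for "sudo"
id -nG "$USER"
```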

    sudo apt-get update

    If you’re following along with the guide mentioned above, Step 1 of the deployment is to install the Tableau Server package and start Tableau Services Manager (TSM). Since we’ve got a version of Linux with a GUI, we did this by downloading from the webpage. If instead you are downloading onto a headless server, you’ll want to install curl and use it to download the installer. Alternatively you can use wget.

    # Option 1: download with curl
    sudo apt install curl
    curl -O https://downloads.tableau.com/esdalt/2021.1.0/tableau-server-2021-1-0_amd64.deb

    # Option 2: download with wget
    sudo apt-get install wget
    wget https://downloads.tableau.com/esdalt/2021.1.0/tableau-server-2021-1-0_amd64.deb

    Depending on where you are within the terminal, you may want to navigate to a different folder before downloading the file. After you download the installer, but before you execute it, you’ll want to make sure you’ve got gdebi-core installed.
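If it isn’t already on the machine, gdebi-core installs from the standard Ubuntu repositories (gdebi installs a local .deb file along with its dependencies):

```shell
# gdebi-core lets you install a local .deb file plus its dependencies
sudo apt-get -y install gdebi-core
```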

    Now we’re all set and ready to actually install Tableau Server! In your terminal navigate to the folder where the file was saved.

    cd Downloads
    sudo gdebi -n tableau-server-2021-1-0_amd64.deb

    From here, you’ll open the package and unpack Tableau Server. Tableau does you a solid and will provide the exact location and command to run the installation script. Don’t forget that tab-complete is your friend in the terminal.

    sudo /opt/tableau/tableau_server/packages/scripts.20211.21.0320.1853/initialize-tsm --accepteula

    Tableau will now begin the initial installation. This happens in two steps: first it will go through a short process to initialize, then you’ll be prompted to continue the install either via the TSM GUI (servername:8850) or via the TSM command line. It even reminds you what your initial user credentials should be for the next step (typically the same as the user you’re logged in as) and what the default URL for the server is.
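If you opt for the command-line route, a quick way to confirm TSM came up is the status command (tsm status is part of the standard TSM CLI):

```shell
# Verbose status lists each Tableau Server process and its current state
tsm status -v
```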

    If you’re working in the TSM GUI (from a browser), now is the time to go to the TSM page. Tableau Server generates a self-signed SSL certificate when it initializes, so you may see an untrusted-connection warning in your browser. You can go ahead and bypass this warning to log in to TSM.

    Remember, the username and password you use to log in to the machine are what you’ll enter here. The time to enter users will come after you decide which Identity Store method you’ll be using.

    You’ll be prompted to register the product, and then get hit with four immediate configuration requests. Identity Store is the most serious setting on this page, because once you set it, you can’t change it. For our deployment we’ll be using Local authentication (meaning we’ll create usernames and the Server will manage passwords). If instead you wanted to use Active Directory (or another LDAP), selecting that option will prompt you to fill in the name of the AD domain.

    If you’re unsure of any of these initial settings, remember you can hover over the section to get a nice paragraph from Tableau about the setting’s purpose. They also have a link at the bottom for the Administrator’s Guide.

    For this next part, go ahead and make yourself a cup of coffee, because this is the longest part of the install. Tableau will go through initializing several components, including setting the initial topology. Depending on the hardware you’re running, this can take anywhere from 10 to 30 minutes.

    Once this step completes, your Server is nearly up and running. The webpage should prompt you to create your first Administrator account. If you’re using Local authentication, you can use any username you’d like (for simplicity we’re repeating the same username). If you’re using Active Directory, you’ll have to pick a user ID associated with the domain. The password flow for AD is slightly different: instead of asking you to generate one, it will simply prompt you to enter your existing password.

    Once you create an administrator account, you’ll be immediately logged into the Server environment (in fact you can see in the screenshot above, it opens a new tab for the server and keeps TSM up).

    Now, because it’s a Linux installation, as a final step you’ll want to download and install the drivers for PostgreSQL. Remember that Tableau Server uses PostgreSQL as the backend to store all of your content, so you’ll need to install the driver to see the Administrator views (located in Server Status).

    New with 2020.4+ is an updated version of the PostgreSQL database. In these newer installations, you’ll have to add a JDBC driver (previously we would use an ODBC driver) to connect to PostgreSQL. So make sure you navigate over to the Driver Download page Tableau provides. At the time of writing, Tableau linked to the following driver: https://downloads.tableau.com/drivers/linux/postgresql/postgresql-42.2.14.jar. If you’ve got a GUI you can use, go ahead and download it from the page – otherwise use curl or wget to download the .jar.

    # With curl
    curl -O https://downloads.tableau.com/drivers/linux/postgresql/postgresql-42.2.14.jar

    # Or with wget
    wget https://downloads.tableau.com/drivers/linux/postgresql/postgresql-42.2.14.jar

    The final steps are to create the directory /opt/tableau/tableau_driver/jdbc (which Tableau mentions you may have to create manually) and drop the driver into it. We did have to create it, so here’s the code snippet. Make sure you’re at the root of the filesystem when you navigate to /opt/tableau. This is also a protected folder, so you’ll need sudo to create the new directories.

    cd /opt/tableau
    sudo mkdir tableau_driver
    cd tableau_driver
    sudo mkdir jdbc
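As an aside, the same result can be had in a single command with mkdir’s -p flag, which creates the full nested path at once and is harmless to re-run. Demonstrated here against a temporary path; for the real install you’d run it with sudo against /opt/tableau:

```shell
# -p creates every missing directory in the path in one step
mkdir -p /tmp/demo_tableau_driver/jdbc
# Re-running is safe - mkdir -p succeeds if the path already exists
mkdir -p /tmp/demo_tableau_driver/jdbc
# For the actual install (protected folder, hence sudo):
# sudo mkdir -p /opt/tableau/tableau_driver/jdbc
```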

    And finally, copy the file into the new directory you just created.

    sudo cp postgresql-42.2.14.jar /opt/tableau/tableau_driver/jdbc

    After we dropped the JDBC driver, our Server install still wasn’t loading the visualizations for the Admin views. So we went ahead and restarted the Tableau Server. That immediately cleared up the issue and we could see our admin views!
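For reference, the restart can be issued from the TSM command line as well as the GUI (expect a few minutes of downtime while the processes cycle):

```shell
# Stop and start all Tableau Server processes so the new driver is picked up
tsm restart
```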

    And that’s it – installation is complete! There are definitely more customizations and configurations we’re sure you’ll want to implement, but pause for a moment and rejoice in setting up a platform for people to interact with their data.

  • Dear Data 2019 – Week 10, Data Pals

    Dear Data 2019 – Week 10, Data Pals

    Week 10 postcards are finally here for the data postcard project Sarah Bartlett and I are working on. The topic for the week was our relationship with each other: how much we communicate, how often, when, and so on. The timing was fantastic – Sarah and I recorded a video feedback session for #IronQuest on business dashboards. That meant in addition to our normal back and forth, we were preparing for the recording (and, as always, monitoring the latest when it comes to the postcards).

    I chose to approach the week by focusing on the time difference between us. Sarah and I use WhatsApp to text each other and after EVERY message there is a timestamp. So this got me very curious as to what the timestamp says on Sarah’s phone. I compiled all of our texts at the end of the week and jotted down their time (in my time) in a Google Sheet. Then I did the math to offset the time and began plotting.

    Dear Data 2019 – Ann’s week 10, data pals
    Dear Data 2019 – Ann’s week 10, data pals (legend)

    To demonstrate the difference in when we were talking to each other I chose to create two different time axes with the days of the week starting at the center line (Sunday) and moving outward toward Saturday. I was hoping to see a common pattern emerge around when we typically talk to each other. In reality it seems like independent of time of day we are both pretty responsive (Sarah did tell me to go to bed once). There’s one very small pattern that would probably emerge over more time – notice that Sarah messages me around noon and then I wake up in the morning and reply.

    You’ll notice that the viz itself is full of corrections – I took a ton of care crafting a precise grid for hours and days, but fell short on labeling, and that in turn influenced where I put a few data points. I also realized after I mailed it that there are times when it’s the next day for Sarah vs. me – so at best the labels for days of the week represent the days from my perspective.

    Ending on a high note I created a bonus viz of emojis. My original idea was to add them as flourish to the dots, but because plotting them correctly proved difficult, I left it to a simple bar chart. Maybe a ratio would have been more effective? Ann uses 4 emojis per every 1 emoji used by Sarah.

    Here’s Sarah’s postcard:

    Dear Data 2019 – Sarah’s week 10, data pals
    Dear Data 2019 – Sarah’s week 10, data pals (legend)

    Once again Sarah delights! She’s clearly on a path of taking things to a new place of abstraction and using other objects as visual metaphors. Not only that, but she also captured MUCH more data about us – our interactions on Twitter, our postcards, and our #IronQuest feedback call. AND she included the number of cat emojis on WhatsApp. (!!) And amazingly enough – if you didn’t know it was data-driven – you’d assume it was just a sketch.

    There’s more to come soon – so keep an eye out! And in the meantime make sure you check out Sarah’s take on the topic.

    I really do hate how rusty these mailboxes are.

  • Dear Data 2019 – Week 9, To Do Lists

    Dear Data 2019 – Week 9, To Do Lists

    Week 9 of the data postcard project Sarah Bartlett and I are working on has arrived. The topic of the week was To Do Lists. I don’t actively keep good task lists – they make me sad – especially when I have tasks that continue to go undone. I also think they fail to capture how priorities can change. So instead of starting and attempting to track a to-do list, I chose to track what it is I’m actually doing.

    I’ll be more clear: because of my line of work, I track all of my work using an app, detailed down to project specifics. I figured it would be interesting to see how my attention gets divided in a given week. From a data collection perspective this meant I didn’t have to do anything different – the data I have is already tracked and the behavior already exists.

    So at the end of the week how did my schedule look?

    Dear Data 2019 – Ann’s week 9, to do lists
    Dear Data 2019 – Ann’s week 9, to do lists (legend)

    I ended up creating a classic Gantt chart showing Monday through Saturday. The bars are positioned based on time, with the far left being 6 AM and the far right being 9 PM; the width of each bar is its duration and the color represents the project I was working on.

    I will say this was a particularly busy week for me. As you can tell, I like to have two major projects going at a time (red & orange), with a significant amount of time spent on keeping the business running (the lighter blue). There’s some obvious bleed-through of other projects this week, so it’s interesting to see how and when they get integrated.

    All the white space is non-clocked time. Depending on where I am, I usually find myself getting up for 5 to 10 minutes and taking a break to reset or switching to a different task. I enjoyed seeing this week illustrated, because it is a good reminder that a structured 9 to 5 schedule isn’t very realistic. It’s much more fluid, with similar start and stop times for each day, but with small gaps driven by the tasks being worked on.

    Sarah, on the other hand, chose to lean in on creating to-do lists; here is her postcard:

    Dear Data 2019 – Sarah’s week 9, to do lists
    Dear Data 2019 – Sarah’s week 9, to do lists (legend)

    She admits, as I have, that to-do lists aren’t really her thing (at which I rejoiced!). Immediately what’s interesting about her list is that it is dominated by personal tasks. For what I’m assuming is a normal work week, only about 25% of her to-dos fall in the work bucket. And I’d even say that she gets a large majority of her tasks done within the day she sets them. I also like the nuance of metallic silver she’s added for aging tasks – a bit hard to see in the photo, but it adds tremendous design effect in person.

    I’m noticing an emerging trend as we continue to create the postcards. Sarah has gone further down the path of using other objects as abstract representations (she herself called the triangles flags), whereas I am mostly still in a more direct mode of creating charts.

    In real time we’re on week 13 – so there’s more to come and more mail to catch up with the progress we’ve made.

    As always, don’t forget to check out Sarah’s blog on the topic.

    Mailed on a Monday evening!
  • Dear Data 2019 – Week 8, Phone Addiction

    Dear Data 2019 – Week 8, Phone Addiction

    After the terrible flurry of complaints, Sarah and I continued on with week 8 of the dear data postcard project we’re conducting. Week 8 was a welcome change, tracking how often we use our phones.

    I was excited to jump on this topic to gain more insight into what I’m doing on my phone (although there is also Screen Time now). To track the data for the week, I created IFTTT buttons that identified the first reason I picked up my phone. This allowed me to keep track of the time and category, and also add a more detailed reason.

    Because I chose to capture only the first thing I did on my phone, the data represents that well, but may miss additional tasks or items I did after I unlocked my phone. It may be better to say that these items were what caused my attention to be diverted to my phone, OR were a necessary task (mapping/music) done via my phone.

    Dear Data 2019 – Ann’s week 8, phone addiction
    Dear Data 2019 – Ann’s week 8, phone addiction (legend)

    This postcard has the most detail of any of mine to date. After tracking the data, I was only able to whittle it down to 15 distinct categories. I felt that any further combining would ruin the detail of the data (and I didn’t like that I had to put flashlight and calculator together).

    Each segment of the line represents one usage and its time. It’s almost like a running-total chart, with the lines connected at each point in time during the day. It was the best way I could think of to show how often I use my phone for something and when (e.g., is it all day or only in the morning?). You’ll notice that this is the second week where I’ve used a ruler and pencil to draw out my visualization in advance – I’m getting much more precise with what I want to convey.

    As no surprise, and apparent from the back side, texting, email, and social media tend to make up the majority of my phone time. I was surprised by a lot of the smaller things I don’t think about but only do on my phone – in particular shopping, which includes both grocery shopping at the store and online shopping.

    I forgot to check my screen time at the end of the week, so here’s the most recent 7 days (does not align with my postcard, but should be good for additional context).

    Twitter dominates Wednesday & Thursday due to Workout Wednesday

    Here’s Sarah’s postcard for the week:

    Dear Data 2019 – Sarah’s Week 8, phone addiction
    Dear Data 2019 – Sarah’s week 8, phone addiction (legend)

    My first reaction when I saw this postcard was just amazement that Sarah was able to create such a beautiful picture with her data. The choice of colors, dots, and final shapes are so pretty. Then of course I’m immediately drawn to noticing that her social media habit picks up dramatically on the weekend (no surprise there), as does her usage of entertainment apps.

    In short, Sarah managed to take a topic that we both probably don’t feel the best about and portray it in a beautiful way!

    And that’s a wrap on this week. I really enjoyed this one, both for the two visualizations we made and for tracking and recognizing what I use my phone for. It’s not all evil (social media); there are lots of little things I depend on it for – mapping, music, the calculator, the time – the list goes on. So while it may be best known for communicating with others, it really does serve its purpose of helping me in all facets of my life.

    At the post office again!

    Don’t forget to check out Sarah’s take on the week!

  • Dear Data – Week 7, Complaints

    Dear Data – Week 7, Complaints

    Week 7 postcards have long been delivered and this blog post is overdue. As if the subject of the cards had some influence, the theme of complaints seemed to have an extremely negative impact on my desire to write the companion blog post.

    During this week I tried to track all of my verbal complaints or times when I felt actively frustrated or annoyed. I genuinely try not to complain very often, so most of my tracked complaints represent high amounts of escalated annoyance or dissatisfaction.

    For data collection I documented all of these moments on my phone, writing a small sentence that expressed the complaint, its subject, and my frustration level. In retrospect I don’t think the data I captured was very accurate, and it seemed that the more complaints I tracked, the grumpier I got about it.

    Here’s my postcard, which really clearly sums up how I felt in general about the topic:

    Dear Data 2019 – Ann’s week 7, complaints
    Dear Data 2019 – Ann’s week 7, complaints (legend)

    Each column represents a day of data (Monday to Friday), chunked into different sections based on the complaint. You can see that Tuesday was not a great day for me: I had 10 different things I complained about. In contrast, Friday has no data, which is due more to me being distracted by other things and less aware of my complaints.

    Each complaint is categorized into a major topic: traffic, the temperature around me, technology, people, and myself. The most vivid complaints for me this week were around the cold. During this week it was extremely cold (comparatively for Phoenix, AZ) and I was in a very drafty building. There’s nothing worse than being cold and trying to work and that was very apparent throughout Monday, Tuesday, and Wednesday.

    Here’s Sarah’s postcard for the week:

    Dear Data 2019 – Sarah’s week 7, complaints
    Dear Data 2019 – Sarah’s week 7, complaints (legend)

    Once again Sarah has done a better job of capturing data detail throughout the week, making her postcard richer with information than mine! I like that she separated the different buckets into two large themes: personal vs. external. I think it probably helps, retrospectively, to know whether the complaints were valid or within her control to change. And I also like the traditional use of a bar chart on the right side to offset the more abstract complaint loops on the left.

    I’m glad to see there are some common themes among our complaints: people, technology, and transportation. We chatted about how cold I was that week afterward and Sarah reminded me kindly that 40 degrees F is not very cold.

    And the best part of the week – mailing off the complaints and being done with data collection on the topic!

    I don’t like how rusty this blue box is.

    Don’t forget to check out Sarah’s take on the week!

  • Installing Tableau Server on Linux (Ubuntu LTS 16.04)

    Installing Tableau Server on Linux (Ubuntu LTS 16.04)

    Over the past six months we’ve noticed a trend – most of our clients are interested in installing Tableau Server on Linux (as opposed to Windows). In fact, at the recent Tableau Conference, over 25% of new Server installs were attributed to Linux distributions.

    With that sense of growing popularity, we wanted to take some time and walk through a basic installation on Linux. This is similar to our previous post deploying Tableau Server on Azure and is not meant to be a template for sophisticated installations. Instead you can consider this a primer of what you can expect when installing on a Linux machine.

    To start the process you’ll need a fresh copy of Linux on a machine that meets Tableau Server’s minimum hardware requirements (64-bit, 2-core processor, 8 GB RAM, 15 GB free space). We chose to install Ubuntu 16.04 LTS from a flash drive onto a system with 16 GB RAM, a 500 GB SSD, and an Intel i7-4770 3.4 GHz quad-core processor. The server was re-purposed from a previous life as a mid-weight gaming PC.

    At the time of writing we downloaded Tableau Server 2019.1.1, selecting the .deb option, which aligns with the operating system we selected.

    Throughout the process we like to reference the Everybody’s Install Guide that Tableau provides. It helps ensure we don’t forget any steps, and we use the major content chapters as an outline of the entire process.

    Following along with Step 1: Install Tableau Server package, we quickly went through the process of updating applications on the system.

    sudo apt-get update

    The installation process then directs you to install gdebi, which allows installation of deb packages (the file type that the Tableau Server install package is).

    sudo apt-get -y install gdebi-core

    Finally it’s time for the good stuff – actually installing the software itself onto the server. To do this you’ll run the command below, first navigating to the directory where the file is located. For our installation the location is Downloads.

    cd Downloads
    sudo gdebi -n tableau-server-2019-1-1_amd64.deb

    This is a relatively quick process, and it ends with a helpful snippet for the next step – running the initialization script and accepting the EULA. Tableau provides the full path of the script, so it’s best to start at the root when executing.

    sudo /opt/tableau/tableau_server/packages/scripts.2019.1.19.0215.0259/initialize-tsm --accepteula

    And as the last two lines indicate, you’re now prompted to log in to Tableau Services Manager (TSM) for the first time using your administrator credentials (most likely the username and password you’re already logged in with).

    There’s then a 4-step process to register your Tableau Server and do some initial configuration. The first option you’ll be hit with is Identity Store. I don’t remember this from past versions, but there’s now a handy mouse-over detail to the right of the option box to help guide you on what to select. We chose Local – meaning we won’t be relying on Active Directory for user authentication.

    Tableau Server then runs through its final initialization process, displaying what it’s doing along the way in a small window. For reference, this took about 10 minutes to complete.

    You’ll then be prompted to set up a Tableau Server Administrator account. This isn’t necessarily the same username and password as the machine the Server is on; rather, it’s the username of the person who will be managing Tableau Server itself.

    At this point you’ll jump directly into a fresh copy of Tableau Server, where it even includes an alert to let you know that the samples are still being generated.

    With the Server finally installed we like to go exploring in TSM (Tableau Services Manager). This is the front-end GUI that Server Administrators can access to do a variety of tasks, including restarting the server, adding licenses, generating logs, enabling SSL, and email/SMTP configuration.
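Most of those same tasks have tsm command-line equivalents, which is handy once you want to script them. Two examples from the standard CLI:

```shell
tsm status -v              # per-process status of the server
tsm maintenance ziplogs    # generate a zipped archive of the log files
```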

    We struggled in the past to find a picture of every screen within TSM, so we decided to include a gallery of each screen below. Click on each picture to see a full size version.

    The last required step to getting started (at least from our perspective) is ensuring the administrator views work. This requires downloading an additional PostgreSQL driver for Linux (it is not bundled with the install). You’ll see this error if the driver isn’t downloaded or installed properly.

    Our initial path forward was to go to Tableau Driver Downloads and download the appropriate driver (as listed).

    After downloading you’ll be able to run the command they provide in the terminal to install the driver. Remember to navigate to the directory the file is in (Downloads) first.

    sudo gdebi tableau-postgresql-odbc_09.06.0500-2_amd64.deb

    Worth mentioning: after we installed this driver, the Administrator views were still not visible, and from our perspective it seemed there was an issue with the driver itself. So we chose to downgrade to an older version (one we knew worked from a previous install). We were able to locate driver version 9.5.3 via AWS.

    https://s3-us-west-2.amazonaws.com/tableau-quickstart/tableau-postgresql-odbc_9.5.3_amd64.deb

    And after installing we finally got a look at those admin views!

    Two more steps we wanted to try out after installing. The first was installing tabcmd and making sure we could connect to the server. By now you’re a pro at navigating to the folder where the installer is (and you know to pick the .deb file) – so getting through this step should be pretty easy.
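As a sketch of that step – the exact filename below is an assumption based on how the server package is named, so match it to the file you actually downloaded from the tabcmd download page:

```shell
cd ~/Downloads
# Hypothetical filename - substitute the .deb you actually downloaded
sudo gdebi -n tableau-tabcmd-2019-1-1_amd64.deb
```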

    And our last step was to ensure the tsm command line client was working and to try out a command only available at this level vs. the TSM GUI. We chose to rename our server to JacksonTwo.
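As a sketch, server-level settings like this are changed with tsm configuration set and then applied; gateway.public.host is our assumption for the relevant key here, so verify it against your deployment (tsm configuration get -k) before running:

```shell
# Assumed configuration key - verify before running on a real deployment
tsm configuration set -k gateway.public.host -v JacksonTwo
tsm pending-changes apply
```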

    After renaming, we navigated to the TSM GUI to see the pending changes that needed to be applied. The server required a restart for the name change to take effect. The final result is below, in a screenshot taken from a Windows machine on our network.

    And with that, a fresh install of Tableau Server on Linux has been installed and quickly customized. Keep a close eye on the blog – over the coming posts we’ll be continuing to dive deeper into Tableau Server.

  • Dear Data 2019 – Week 6, Physical Contact

    Dear Data 2019 – Week 6, Physical Contact

    Week 6 postcards of the data project Sarah Bartlett and I are working on are here and I couldn’t be more excited. The theme of week 6 was physical contact.

    During the original project Giorgia and Stefanie tracked people they touched and who touched them, but I decided to switch things up and include my cats. Since I primarily work from home I figured it was a great opportunity to add in additional data elements. I was also genuinely curious to see at the end of the week who gets the most physical affection from me in my house.

    I used IFTTT again to track touches and set up buttons on my phone to represent the 5 major buckets I was likely to encounter: my husband, my cats, family/friends, and strangers. I chose to only represent intentional touches and those that I gave – making data collection a bit less awkward.

    Here’s my postcard:

    Dear Data 2019 – Ann’s week 6, physical contact
    Dear Data 2019 – Ann’s week 6, physical contact (legend)

    Cutting to the chase, the bar chart on the back of the postcard clearly indicates that I give out most of my affection to Starly.

    Starly interfering with work

    Focusing back on the design of the postcard, this week I wanted to continue to push the boundaries and go further into an abstract representation of data. The postcard started out as a 10 x 10 grid that was going to have clearly defined borders for each individual square. The design continued to morph as I started using metallic pen markers, which made the edges softer and revealed something I hadn’t considered: the connected blocks (Tetris pieces) of touches. Through the imprecision of my drawing, what was originally a very strict grid turned into a more quilt-like representation of my week.

    As two last design elements, I chose to outline the entire pattern in pink – no data is represented there, but I felt it completed the transformation the data took on. And one cognition piece was adding dots to help read the grid appropriately, left to right and top down.

    Although the data revealed what I had suspected, visually seeing how connected and integrated my pets are into my life and well-being has been extremely impactful. It’s a reminder of the companionship they offer and of our shared affection.

    Here’s Sarah’s postcard:

    Dear Data 2019 – Sarah’s week 6, physical contact
    Dear Data 2019 – Sarah’s week 6, physical contact (legend)

    I love this design by Sarah. She managed to take a data subject and turn it into a complete picture. I especially like how she chose 3 specific types of touches – hugs, kisses, and handshakes – and how they correspond to different social circles. It’s amazing, when counting the petals, how sacred physical contact is with those closest to us vs. colleagues and other outer-circle individuals.

    And that’s a wrap on the week – save one last off topic thought I had. After crafting my postcard I couldn’t help but think that it looked similar to some other artwork I’ve seen before.

    Patchwork vs. Postcard

    Although I’m probably biased on creating the connection, I enjoyed the idea that the game had somehow influenced the final drawing.

    A new blue box!

    Don’t forget to check out Sarah’s take on the week!

  • Dear Data 2019 – Week 5, Purchases

    Dear Data 2019 – Week 5, Purchases

    Week 5 of the data postcard project Sarah Bartlett and I are working on has long since arrived, and this blog post is overdue. I am a tiny bit behind schedule and can’t blame the mail for this week!

    The topic to track and visualize this week was items we purchased. So for this week I tracked every single receipt and line item – basically anything that I spent money on. This was a relatively easy tracking week because I held onto all my receipts and then compiled the data into a spreadsheet at the end of the week.

    I was genuinely curious about what I spend my money on, and this was a fairly normal week of tracking. I didn’t go anywhere spectacular, host friends for an evening, or have any emergencies (that required money). I consider myself pretty frugal, not tending to buy clothes or luxury items very often, with one exception: I have very specific food preferences. I don’t like luxury food, but I do follow a very specific diet that includes pricey food items.

    Here’s how the week turned out:

    Dear Data 2019 – Ann’s week 5, purchases
    Dear Data 2019 – Ann’s week 5, purchases (legend)

    For this week I took the approach of going more abstract with my visualization. I didn’t want to represent time directly (as I’ve previously discussed), and I also wanted to veer off course to create an image/drawing/picture based completely on data. What I ended up with were potted flowers. Each pot represents a separate receipt. The leaves represent each individual item on the receipt, the circles represent the amount spent, the color of the circles/buds represent the product category. Finally, if there is a saucer at the bottom of the pot it means that it was a purchase just for me vs. my entire household (me and my husband).

    This is probably my favorite postcard so far – it gave me joy when Sarah shared her reaction to it. And seeing it again in the blog post also makes me quite happy.

Getting down to specifics – my suspicions were confirmed: of the $348 I spent, $315 was on food (that’s 90%). Pretty much all the rest of the purchases were luxuries in the form of a book and a board game. The remaining purchase was airport parking.

And as you begin drawing conclusions you’ve probably got one of two thoughts: either Ann spends a lot of money on food every week, or something else is going on. I try to avoid going grocery shopping every week, so I tend to stock up on things – this was a week of stocking up.

For comparison, here is Sarah’s postcard:

    Dear Data 2019 – Sarah’s week 5, purchases
    Dear Data 2019 – Sarah’s week 5, purchases (legend)

    I like Sarah’s approach of marking whether products were essentials or not. Similar to my card, she’s chosen to denote whether a purchase was for herself or someone else. She mentioned she was sick during this week and you can see exactly when that happened with the medicine purchase! I also like the balance of essentials vs. non-essentials. The final visual is simple and very effective.

    Mailed from inside the physical post office!

And that’s it for the week. My last lingering comment is that this postcard arrived on Sarah’s doorstep in the fastest time (so far!). I have a strong suspicion that it’s due to dropping it off inside vs. outside the post office. Make sure to check out Sarah’s take on the week!

  • Dear Data 2019 – Week 4, Mirrors

    Dear Data 2019 – Week 4, Mirrors

    Week 4 of the data postcard project Sarah Bartlett and I are working on this year is here. We still have yet to reach consistent timing for postcard arrival. Sarah usually receives mine 2 days or more before I receive hers, but this week we were only one day apart.

Week 4’s topic was all about mirrors and reflections of ourselves. I was intrigued by this one – I had no sense of how often I look at myself properly in a mirror. Also, I decorate my house with a lot of mirrors (which you’ll see) – not because I am vain, but because they are great at reflecting natural light and making spaces appear larger.

I ended up re-purposing my IFTTT buttons for this week, and found the data collection process much less labor intensive. In the original collections from Giorgia and Stefanie, they had both captured accidental glances; however, I chose not to go down this path since I would likely spend way too much mental energy determining if a glance was accidental or on purpose (or turned into having a purpose).

    Dear Data 2019 – Ann’s week 4, mirrors
    Dear Data 2019 – Ann’s week 4, mirrors (legend)

    For the final visualization, I also decided NOT to use time as a dimension. Time has shown up in several of our previous postcards, so it was time to do something different. Instead I chose to represent the 5 different types of mirrors/reflective surfaces that I am around. I also captured some meta data related to the mirrors themselves, with each sketch being a rough estimate of the shape and proportion of each mirror.

As with previous weeks, I chose to collect from Monday through Friday – and there’s some good insight with that knowledge. Looking at my bathroom mirror, there are 16 glances, 10 of which are me brushing my teeth. After seeing the results, I think what surprised me most was the kitchen mirror. My kitchen is in the center of a very open floor plan, but I didn’t realize how often I used its mirror to check my appearance. Conversely, the green mirror (my bedroom) is where I apply makeup or do my hair.

I’m not impressed with my postcard this week. While I think it is an effective unit chart, I’m struck by the imprecision of the dimensions and some of the sloppy sketching. And the hashing of the corners to denote whether I was at home or not didn’t add much to the overall look.

    And here’s Sarah’s week 4 postcard:

    Dear Data 2019 – Sarah’s week 4, mirrors
    Dear Data 2019 – Sarah’s week 4, mirrors (legend)

I really like Sarah’s postcard this week. She managed to pull off a lot of depth by using different textures and writing instruments (there’s pencil vs. marker). If my assumptions are correct, then she and I start our mirror glancing the same way – in the bathroom. I also appreciate that she spent more time being specific about what was happening when she was looking at the mirror, and conscious of using mirrors for makeup.

    Mailed Sunday night from my favorite blue box!

    And that’s it for week 4 with mirrors. Don’t forget to check out Sarah’s blog post and get her take on the week.

  • Dear Data 2019 – Week 3, Thank Yous

    Dear Data 2019 – Week 3, Thank Yous

    Week 3 postcards for the data project Sarah Bartlett and I are working on this year have finally reached their destinations. I think we both felt that the mail was slower than normal, perhaps due to the abnormally cold weather here in the US.

    Week 3’s topic was tracking how often we say “thank you.” I knew going into this week that it wasn’t going to be an easy task. I say “thank you” a lot, so I decided to set up IFTTT buttons on my phone. They also show up on my Apple Watch to make it much easier to record the data as soon as it happens.

    IFTTT button set to record values when button pushed

    The recipe for each Applet is very simple. Once a button is pushed it will write a row to a spreadsheet called DD3 with the following columns of information. I customized the last 2 fields based on my desire to capture the medium (in person/virtual) and who the person was. Here are the final buttons, they reside in the widgets area of my phone. I put IFTTT at the very top to make sure they’d be easy to access.

    So many buttons!

    Sarah and I also talked about how we were each going to track data this week and she also ended up using IFTTT. And after using the buttons over the course of the week, I will definitely be reusing this technique for future weeks.
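The recipe really is that simple: press a button, and one timestamped row lands in the spreadsheet. Here's a rough sketch in Python of what each press records – the exact column names are my assumption, since the post only mentions capturing the medium and the person:

```python
# A minimal sketch of what each IFTTT button press logs. The real
# applet appends a row to a spreadsheet named DD3; here a plain list
# of dicts stands in for the sheet. Column names ("timestamp",
# "medium", "person") are assumed, not confirmed by the post.
from datetime import datetime

def log_thank_you(sheet, medium, person):
    """Append one row per button press, mimicking the IFTTT applet."""
    sheet.append({
        "timestamp": datetime.now().isoformat(timespec="seconds"),
        "medium": medium,   # "in person" or "virtual"
        "person": person,   # e.g. "husband", "friend/family", "stranger"
    })

# Simulated presses: each becomes one row in the stand-in DD3 sheet.
dd3 = []
log_thank_you(dd3, "virtual", "business")
log_thank_you(dd3, "in person", "husband")
```

One button per medium/person combination is what makes this work on a watch – there's nothing to type, so each tap maps to exactly one pre-filled row.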

Now that the data collection backstory is out of the way, here’s my postcard:

    Dear Data 2019 – Ann’s week 3, thank yous
    Dear Data 2019 – Ann’s week 3, thank yous (legend)

    In this visualization each vertical line represents a day of the week (Monday to Friday). Each shape coming off the line is a thank you. Those on the left side are for people outside of my inner circle (business contacts, strangers, people on social media). Conversely those on the right represent my close friends, family, and my husband (Josh). I chose to use shapes to represent whether the thank you was for Josh or not, as seen by the triangles vs. circles.

    The shapes are also plotted in sequential order throughout the day, with the top being the first thank you and the bottom being the last thank you. And the final two pieces are: pink or green to represent in person vs. digital and an additional < next to business related thank yous.

    I also cheated a bit this week and mocked up the postcard in Tableau. I wanted to make sure the faint idea I had in my head would look okay on paper. You’ll notice very quickly that quite a bit of the detail was reduced for the postcard.

    Initial visualization in Tableau

I really enjoyed the pattern that my data revealed this week. On most days I start my morning sending thank you emails, and then as the day progresses, I end up leaving my house or talking to other people. The thank yous I dish out to Josh seem to be very dependent on what the focus of the day is.

    In contrast, here’s Sarah’s week 3 postcard:

    Dear Data 2019 – Sarah’s week 3, thank yous
    Dear Data 2019 – Sarah’s week 3, thank yous (legend)

    I say “in contrast” jokingly here, because I think we took a VERY similar approach. Not only superficially in the usage of lines for passage of time and the choice of triangles, but also in the choice of detail we decided to track. Her breakout of people is similar to mine, leaving a special place for her husband, and carving out social layers from friend/family, to work, social media, and finally strangers. One thing she did that I really like was include the vertical line for AM/PM. I think that adds a little more context to the flow of each day.

    Same blue box as last week!

    That’s a wrap for this week! Don’t forget to check out Sarah’s blog post and get her take on the week.