
From Spreadsheets to Tableau: Creating Dynamic Data Visualizations

Data visualization can be a tough undertaking for many organizations. Choosing which metrics to track, and how to visualize them effectively, can be challenging when you are limited to tools like Excel. And how do you keep those visuals up to date with the ever-changing demands of your business? Alexa Tipton explains how she partnered with a client to achieve just that.

 
Can you tell us about your background and current role at Data Tapestry?
Currently, I’m a data scientist at Data Tapestry. Before that, I was a research assistant at UT Knoxville with the Center for Ultra-Wide Area Resilient Electric Networks, or CURENT. While there, I used a number of techniques, including text mining, machine learning, and NLP, to analyze tweets. The goal was to understand the public’s sentiment toward their energy providers around natural disasters and, more importantly, to improve electric grid structures overall as part of a National Science Foundation project.

Tell me about how your project started with the client. 
My project consisted of several discrete parts, but overall there were two main dashboards. The first was a dashboard to visualize physician salary shortfall. Clinicians bill insurance various amounts via CPT codes for different procedures, and they are contractually obligated to bill a certain amount relative to their salary. If they bill less than that amount, they are in shortfall and might enter a probationary period with some potential consequences. If they bill over that amount, they are rewarded. The second, a hires and terminations dashboard, was designed to look at hiring and terminations of staff over time. The overall goal was to eliminate relying on spreadsheets for this information.
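As a rough illustration of the billing rule described above, here is a minimal Python sketch of how a shortfall flag might be computed. The column names, the target ratio, and the pandas-based approach are assumptions for illustration only, not the client's actual business rules or implementation.

```python
import pandas as pd

TARGET_BILLING_RATIO = 1.5  # assumption: each clinician must bill 1.5x salary

def flag_shortfall(charges: pd.DataFrame) -> pd.DataFrame:
    """Sum CPT-coded charges per physician and compare against a salary-based target."""
    summary = (
        charges.groupby("physician_id", as_index=False)
               .agg(salary=("salary", "first"), total_billed=("billed_amount", "sum"))
    )
    summary["billing_target"] = summary["salary"] * TARGET_BILLING_RATIO
    # Below the target -> shortfall (possible probation); at or above -> reward.
    summary["status"] = summary["total_billed"].ge(summary["billing_target"]).map(
        {True: "meets or exceeds target", False: "shortfall"}
    )
    return summary
```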

Can you describe your relationship with your stakeholder and how you managed the project to ensure forward progress? 
We checked in via weekly progress meetings. I didn’t want to go too far in one direction because I needed feedback to refine the visuals. There wasn’t a set of formal requirements, so I had to think through how to best visualize the information and display it in a way most helpful to the client. Occasionally, I’d have to advise against certain additions or revisions so the project could stay on track.

What type of techniques did you use for data mining and why?
For one of the dashboards, they wanted to see month-to-month changes over time. A lot of the data was not formatted consistently since much of it was manually entered, so I had to manipulate the data to get a monthly snapshot.
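To give a sense of what that kind of reshaping can look like, here is a minimal pandas sketch that parses inconsistently entered dates and rolls the records up into a monthly snapshot. The column names and cleaning steps are illustrative assumptions, not the actual pipeline.

```python
import pandas as pd

def monthly_snapshot(records: pd.DataFrame) -> pd.DataFrame:
    """Clean manually entered dates and collapse records to one row per physician per month."""
    records = records.copy()
    # Coerce inconsistent date strings; anything unparseable becomes NaT and is dropped.
    records["entry_date"] = pd.to_datetime(records["entry_date"], errors="coerce")
    records = records.dropna(subset=["entry_date"])

    # One snapshot row per physician per calendar month, ready for month-to-month comparison.
    records["month"] = records["entry_date"].dt.to_period("M").astype(str)
    return (
        records.groupby(["physician_id", "month"], as_index=False)
               .agg(record_count=("entry_date", "count"),
                    total_billed=("billed_amount", "sum"))
    )
```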

Describe the nature of the deliverable you provided and how it helped the client. 
The deliverables were two Tableau dashboards. The salary shortfall dashboard replaced a quarterly report that had to be manually compiled in spreadsheets and then emailed out to the relevant parties. Now the dashboards can be updated on a daily basis, and anyone can look at the most up-to-date version. Both dashboards save time and eliminate the prospect of human error when manually collecting and presenting this information.

If you are interested in how Data Tapestry can help your business, email us at business@datatapestry.ai or visit our website at datatapestry.ai.
