An Introduction To Tableau On Making Raw Data Useful

This article was published as a part of the Data Science Blogathon.


With this Tableau tutorial, you’ll learn how to visualize data and derive valuable insights from raw data by making dashboards, reports, tables, and more. We will also go through topics like Tableau Desktop, Tableau Server, and the various components of Tableau.

Tableau is a powerful application for business intelligence and data visualization. It is used extensively for generating and distributing highly interactive and robust reports and dashboards that present data visually as charts, graphs, plots, and so on. Leading institutions are built on data, so it is imperative that they make the most of the data sources available to them.

However, raw data is often complex and hard to interpret, and Tableau provides a solution for this. For acquiring and processing data, Tableau can connect to many file types, such as Excel, CSV, and JSON. It can receive and process structured (e.g., .csv), semi-structured (e.g., .json), and big data sources.

Applications of Tableau

Data can be sliced and diced using Tableau’s features, and then we can create appealing representations. Displaying the same data as bars, charts, graphs, and plots makes it far easier to decipher. Hidden patterns, relationships, trends, and new meanings become detectable, so you can make informed and valuable business decisions. Its major applications are as follows:

Come up with quick visualizations and insights on different data sources

Harness your databases and maximize query performance

Apply statistics such as trend analysis and forecasting

Integrate powerful table calculations with programming languages like R

Interact with data through Tableau dashboards in the most intuitive way possible

The Components of Tableau

Tableau products work together and help users create data visualizations and generate reports by seamlessly transferring data between the products. Here is a list of the products or components:

Tableau Desktop

Tableau Desktop lets you import data from different sources and create dashboards, stories, and workbooks. With Tableau Desktop, you can share all the insights with others and publish the workbooks online. It allows users to run direct queries on datasets without typing any code: you only need to choose the columns you wish to include and add visualizations like charts, tables, graphs, and maps. It can combine many sources of data into one dashboard.

Tableau Server

Tableau Server is used for publishing the reports and workbooks created in Tableau Desktop. Users can access workbooks and reports from anywhere, including the latest content and the workbooks and reports generated by other users. To maintain security, the Tableau Server admin has the right to set authorization on specific projects, views, workbooks, and data sources.

Tableau Reader

Tableau Reader is a free application you can install on your desktop and use to view the data visualizations built by users of Tableau Desktop. With Tableau Reader, you can view, interact, apply different types of filters, and drill down into the data without affecting the original datasets and visualizations.

Tableau Public

Tableau Public lets you create workbooks and save them to its public cloud, but anyone can view those workbooks because they are open to everyone.

Tableau Online

Tableau Online is a platform that makes it easy to publish dashboards and share them with other users. The tool allows you and your colleagues to work on a project together and extract valuable information that can be transformed into visually interactive workbooks. Visuals can be accessed via the web, desktop, and mobile.

Different Types of Visualizations

The most important and widely used Tableau visualizations are as follows:

Line Graph: Used primarily for continuous dimensions, such as trends over time

Bar Graph: For discrete, non-continuous dimensions

Dual Axis Graph: Used to present two different variables or measures at the same time

Geographical Graph: For measuring sales and plotting other data on geographic maps

Area Graph: For comparing measures

Tree Map: Nested rectangles are used to present quantities

Heat Map: Used to measure differences across categories

Tableau Maps

The best way to depict geographical data is with Tableau maps. The main purpose of maps is to compare data across different geographic areas. Tableau supports several map types:

Proportional Symbol Maps: 

These maps are mainly used to visualize quantitative data for specific locations, and two or more quantities or variables can be encoded per location. For example, an earthquake map showing magnitudes over the previous 10 years works well as a proportional symbol map.

Point distribution Maps: 

A point distribution map plots individual data points at specific locations and can be used to visualize events that took place at a particular time. To create one, you need the latitude and longitude of each point in your data source.

Heat Maps: 

Heat maps visualize large volumes of data and spot trends that help the user make better decisions. You can create a heat map of your site data and see where users are coming from.

Flow Maps:

A flow or path map in Tableau shows the journey of an object from one place to another; a hurricane is a good example. A hurricane’s flow map shows its path over time, from origin to end.

Spider Map:

These origin-destination maps show the paths from a single origin to multiple destinations. An example of a spider map would be a visualization of migration data for people who moved from one country to another.


Tableau is one of the data visualization technologies growing exponentially, primarily in the business intelligence sector. It is swift to deploy, simple to use, easy to learn, and very intuitive for customers. It converts raw data into insights in an effortless and lucid way, processes data quickly, and produces dashboard-style graphics. Tableau solutions are applicable across many departments and industries.

This introductory tutorial is intended for all students who want to understand the software and work in the business intelligence industry. Anyone can go through it to understand the basics, whether the goal is data analytics, data visualization, or business intelligence.

The media shown in this article is not owned by Analytics Vidhya and is used at the Author’s discretion.



An Introduction To Using Zenmap On Linux

Zenmap is a cross-platform application available for Linux, Windows and OS X. Aside from Linux-specific information, like the installation process, this tutorial applies equally to all of the supported platforms. Speaking of the installation process, you can install it on Ubuntu using the Ubuntu Software Center (just search for “zenmap”) or from the command line using:


sudo apt-get install zenmap


The above command also works on the Raspberry Pi and probably most other Debian- or Ubuntu-derived distributions. For other distros that use yum, like Fedora, use:



sudo yum install nmap-frontend

Although Zenmap can be launched via the desktop, it is best to start it from the command line with root privileges; otherwise Zenmap can’t use some of nmap's functionality.

To start it on Ubuntu run:

sudo zenmap
There are two main ways to start an nmap scan using Zenmap: either enter a target address and select a scan type from the “Profile” drop-down list, or enter the command directly in the “Command” field. If you are familiar with nmap, or you want to try out some of the commands from the previous articles, you can use the “Command” field directly.

The power of Zenmap is that it stores and sorts all the information from any scans performed and lets you build up a picture of your network. The easiest thing to do is a Ping scan to see which devices are alive on your network. In the “Target” field enter 192.168.1.0/24 and select “Ping scan” from the Profile list. If you are using a different network range from 192.168.1.x, I will assume from here on that you know how to enter the correct range. For more details, please see the previous parts of this series.
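Under the hood, the “Ping scan” profile runs nmap with host discovery only (`nmap -sn`), and Zenmap builds its host list from nmap's output. The sketch below uses invented addresses laid out in nmap's real grepable (`-oG`) format to show how the live hosts can be pulled out on the command line:

```shell
# Sample output in nmap's grepable (-oG) format; the hosts themselves are
# invented, but the line layout matches what nmap emits for a ping scan.
sample='Host: 192.168.1.1 ()	Status: Up
Host: 192.168.1.10 ()	Status: Up
Host: 192.168.1.20 ()	Status: Down'

# Print only the live hosts, as Zenmap's host list does.
printf '%s\n' "$sample" | awk '/Status: Up$/ {print $2}'
```

On a real network you would feed the awk filter from `nmap -sn 192.168.1.0/24 -oG -` instead of the sample string.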

Down the left side of the window, you will see a list of the devices (hosts) found on your network and on the right, the output from the nmap command. Above the output pane is a set of tabs: “Nmap Output”, “Ports/Hosts”, “Topology”, “Host Details” and “Scans”. Each one of these tabs shows more information about your network and the information presented is accumulative. This means the more scans you do, the more information is available.

Run an Intense scan against the same range (the Intense scan profile runs nmap -T4 -A -v) to discover the open ports and operating system on each host. After the scan, the OS icons will change in the hosts list on the left, and the “Ports/Hosts” and “Host Details” tabs will offer more information about each host.

Each circle on the “Topology” diagram represents a host found on the network. If a host has fewer than three open ports, it will be green; three to six open ports, yellow; and more than six open ports, red. Hosts with filtered ports will have a yellow padlock symbol next to them.
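The colouring rule above can be written down as a tiny shell function (a sketch; the handling of exactly three and exactly six open ports follows the fewer-than-3 / 3-to-6 / more-than-6 banding described above):

```shell
# Map a host's open-port count to its Zenmap topology colour.
host_colour() {
  if [ "$1" -lt 3 ]; then
    echo green
  elif [ "$1" -le 6 ]; then
    echo yellow
  else
    echo red
  fi
}

host_colour 2   # green
host_colour 4   # yellow
host_colour 9   # red
```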


As a further exercise, try using some of the scans listed in the first two parts of this series by entering them directly into the “Command” field. If you want to permanently add these to the “Profile” drop-down list, use the built-in profile editor (under the Profile menu). The profile editor is also a good way to experiment with other scan parameters, since the editor itself presents many of the nmap options as part of its user interface.

Gary Sims

Gary has been a technical writer, author and blogger since 2003. He is an expert in open source systems (including Linux), system administration, system security and networking protocols. He also knows several programming languages, as he was previously a software engineer for 10 years. He has a Bachelor of Science in business information systems from a UK University.


An Introduction To Mobile SEO

Mobile is to SEO what glaze is to Krispy Kreme. You can’t have one without the other. It is the backbone of Google’s index.

Sure, the mobile-first index only began rolling out in 2018, but Google had been dropping not-so-little hints for the past few years.

In 2015, Google announced that mobile searches had surpassed desktop. Then in 2016, mobilegeddon 2.0 rocked the SEO world. And in 2016, Google introduced us to the mobile-first index.

But the question still remains:

What should my mobile strategy be?

It isn’t enough to have a mobile-friendly site.

This post will tell you all you need to know to get started with mobile SEO.

Step into your future with the basics of mobile SEO.

How Google Deals with Mobile Search

If it isn’t obvious yet, Google clearly favors mobile search.

But it can be quite confusing understanding how Google deals with mobile search.

So, here’s the lowdown on some common FAQs about mobile search and Google.

What URL Does Google Index If You Have Separate Mobile and Desktop URLs?

Google will display the desktop URL for the desktop searches.

And, the mobile searches will get the mobile URLs.

But, the indexed content (the big chalupa that determines how you rank) will be from mobile.

URLs in search: With Mobile-first indexing, we index the mobile version. When we recognize separate mobile URLs, we’ll show the mobile URL to mobile users, and the desktop URL to desktop users – the indexed content will be the mobile version in both cases.

— Google Webmasters (@googlewmc) June 14, 2018

Will I Lose My Ranking Positions with the Mobile-First Index?

No. But mobile-friendliness is a ranking factor, so your UX is still important.

Mobile-friendliness is reviewed page-by-page, which means you’ll want to update your money pages first.

On ranking: The mobile-first index doesn’t change anything for ranking other than that the mobile content is used. While mobile-friendliness is a ranking factor on mobile, being in the mobile-first index is not.

— Google Webmasters (@googlewmc) June 14, 2018

Allow me to let you in on a little secret:

Google wants both your desktop and mobile site to have the same content.

If you have the same content (like a responsive design), you will see no impact from the mobile-first index.

Plus, on the bright side, Google sends notifications to let webmasters know when mobile-first indexing is going down on your site.

Is Your Site Mobile-Friendly?

To find out whether your site is mobile-friendly, you can run it through a testing tool such as Google’s Mobile-Friendly Test.

Best Practices for Mobile SEO

Let’s break down how to optimize your site for mobile search.

We’ll start by exploring a few mobile SEO best practices and techniques that apply to all mobile sites.

Mobile Content

To sum up mobile SEO, you want the same exact content from your desktop on your mobile site.

All content formats (text, videos, images, etc.) should be crawlable and indexable on mobile.

Google updated their app and mobile results to display badges for image search.

This means those image alt attributes that you’ve been ignoring are becoming even more relevant in mobile search.

I mean, if Google can already recognize dog and cat breeds through photos, can you imagine what’s next?

Also, with the rise of voice search, you may want to consider aligning your content.

For example, I would recommend optimizing your meta titles for mobile search because they are shorter.

Remember, voice search is performed from a mobile device, so it makes sense to optimize your mobile site.

Voice search = mobile device.

This means redefining the way marketers perform keyword research.

Long-form queries and questions are dominating the SERPs, which is why things like featured snippets are having a major impact.

It’s about user intent now.

Mobile Site Performance

To quote Top Gun, “I feel the need for speed.”

Yes, Google is feeling the need for speed as the official mobile “Speed Update” hit the scene.

This is why Google introduced the Accelerated Mobile Pages Project to improve site speed and page load times for mobile content.

AMP allows content to be cached and served directly within a SERP (rather than sending the user to the original website).

This is also why the industry will start to see AMP pages integrate with PWAs.

I would recommend using responsive design as well as AMP pages.

For example, using AMP pages to serve your blog posts and services pages if you’re an SEO agency may be something to consider.

And, if you want to get really deep into page speed, listen to this podcast with Bastian Grimm and Brent Csutoras as they discuss paint timings.

Making Your Website Mobile-Friendly

There are three main approaches to making your website mobile-friendly:

Responsive Design

Adaptive Website

Separate Mobile Site

Here’s how to optimize each.

1. Optimizing Responsive Design for Mobile Search

There’s a mistaken belief that if your site is responsive, then it’s automatically mobile-friendly. It isn’t necessarily so.

Let me explain.

Responsive design maintains the same website content from desktop to mobile. It takes the same URLs, HTML, images, etc.

However, responsive design does not mean that the needs of your mobile visitors are met.

Responsive design still needs to be optimized for user experience.

With that said, Google has stated that responsive design is their preferred option for mobile SEO, but has confirmed there is no ranking boost to gain from having a responsive site.

And a study by Appticles, published in Smashing Magazine, found that responsive websites are the most common type of mobile-friendly site, holding it down at 51.11 percent.

Here is what you need to know about optimizing your responsive design for mobile SEO:

Scale Images

Images already scale with responsive design. But the desktop versions may not be what your mobile users need. I’ll show you an example.

Here’s Navy Federal Credit Union’s homepage on desktop and mobile.

Now, here’s Bank of America’s homepage on desktop and mobile.

Which is easier to use on a phone? Bank of America, right?

Navy Federal’s desktop top image takes over the mobile website with no call-to-action. On the other hand, Bank of America’s CTA’s are front and center.

Key takeaway: Scale images for mobile users if you’re using responsive design. Ask your developer to create alternate images for different viewports, and don’t forget to update the viewport meta tag (<meta name=”viewport” content=”width=device-width, initial-scale=1″>).

Clean Navigation

I would recommend monitoring your mobile user behavior to understand what they are searching for on your site.

Then, tailor your navigation to their needs.

For example, Best Buy keeps their navigation at the top with their main pages along with a hamburger menu in the center.

Sidebar: Google has confirmed that hamburger menus are “fine” to use in mobile design.

Key takeaway: Keep your mobile navigation simple and clean. Limit the main pages to 4-8. If you need more, utilize a hamburger menu.

Delete Mobile Pop-Ups

Google wants your mobile users to find what they need and fast.

To help their mobile users, Google introduced a new signal in 2017 that stated:

“Pages where content is not easily accessible to a user on the transition from the mobile search results may not rank as high.”

That’s not to say all pop-ups are bad.

Here’s an example of a bad pop-up:

The newsletter takes up the entire screen without letting users read the content behind it.

Here’s an example of a good pop-up:

The image does not take up the full screen and the visitors can still see the content.

Key takeaway: Proceed with caution when it comes to pop-ups. There is proof that pop-ups do work. In fact, Sumo saw a 9.28 percent conversion rate for their pop-ups. Just tread lightly.

Shorten Copy

Desktop copy does not often translate well to mobile copy.

Lots of text on a mobile site can crowd and overwhelm mobile users.

I like to keep things simple by reducing text. Let me show you.

Now, here’s an example of a well-edited mobile site:

The copy is reduced above the fold to keep the CTA clear and concise.

They pushed the longer form copy down for users to learn more if they scroll.

Key takeaway: Less is more. Keep conversions high by reducing the amount of copy above the fold. Entice users to scroll with the initial text, then give them the full monty after scrolling.

Design CTAs

iAcquire and SurveyMonkey discovered that 70 percent of mobile searches lead to action on websites within one hour.

But, mobile conversions are lower than on desktop.

Why? Often, the call-to-action is not clear.

Here’s what I mean:

Look at this mobile site:

They require the user to scroll to see the CTA button.

It’s likely they are losing out on mobile conversions.

Now, here is an example from Flywheel.

You can see that they want users to “Sign up for free” to use their product.

2. Optimizing an Adaptive Website for Mobile Search

An adaptive (or RESS/dynamically served) site uses the same URL, but the server sends a different version of the HTML (and CSS) based on what type of device is requesting the page.

You essentially have three different versions of your website:

Desktop

Tablet

Mobile
Amazon is a great example of adaptive web design:

So, why did Amazon choose to use an adaptive web design over responsive?

Mobile Marketer reported that Amazon chose adaptive design to increase page speeds on mobile by 40 percent.

If you’re a small business, I’d recommend going with the popular vote of a responsive design.

Adaptive websites require more resources.

Here is what you need to know about optimizing your adaptive website for mobile SEO:


Avoid Cloaking

Google will devalue your site if you’re showing one thing to the search engine and something different to the user.

This is cloaking.

To avoid this issue, ask your host to send the Vary: User-Agent HTTP header.

This tells Google (and any caches in between) that the HTML served at a URL varies by device type, so the mobile crawler knows to fetch the mobile content.

Customize Design

With adaptive design, developers have full control over the layout, display, and graphics.

If your website is tailored to multiple countries, then you may want to swap out the design elements based on region.

The downside to this is that you’ll have to manually update each version of the site.

For example, you can serve custom meta titles and meta descriptions that target mobile users.

Combine Adaptive with Responsive

There is an alternative before going knee deep in adaptive.

You can utilize responsive design with adaptive logic.

Developers can customize for mobile users using client-side JavaScript.

3. Optimizing a Separate Mobile Website for Mobile Search

A separate mobile site (or m-dot) is a completely different site.

The same basic SEO principles remain the same for your desktop, tablet, and mobile, but there are a few differences.

Here is what you need to know about optimizing your separate mobile website for mobile SEO:

Separate URLs

Each desktop URL should have an equivalent mobile URL.

You will need to add a rel=”canonical” tag to the mobile page pointing to the desktop URL, like this (using example.com as a placeholder):

<link rel=”canonical” href=””>

This can also be done in the sitemaps.

Implement Mobile Switchboard Tags

Switchboard tags are similar to canonical tags: they tell Google a mobile URL exists, via a rel=”alternate” tag.

Without switchboard tags, Google may not crawl the mobile versions of your URL.

You will need to add the rel=”alternate” tag to the desktop page, pointing to the mobile URL, like this (using example.com as a placeholder):

<link rel=”alternate” media=”only screen and (max-width: 640px)” href=””>

This can also be done in the sitemaps.
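Putting the two annotations together: every desktop page carries an alternate tag pointing at its mobile twin, and every mobile page carries a canonical tag pointing back. The sketch below derives an m-dot URL from a desktop URL and prints both tags; example.com and the m. subdomain pattern are placeholders, not from the article:

```shell
# Hypothetical desktop page (placeholder domain).
desktop=''

# Derive the m-dot URL by swapping the "www." subdomain for "m.".
mobile=$(printf '%s' "$desktop" | sed 's#://www\.#://m.#')

# Switchboard tag: goes on the DESKTOP page (or in the sitemap).
printf '<link rel="alternate" media="only screen and (max-width: 640px)" href="%s">\n' "$mobile"

# Canonical tag: goes on the MOBILE page, pointing back to desktop.
printf '<link rel="canonical" href="%s">\n' "$desktop"
```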

Detect User-agent Strings

Double-check your redirects to make sure that each desktop URL redirects to the corresponding mobile URL.

Otherwise, you could create a faulty redirect (not good).

Luckily, Google Search Console will detect these faulty redirects for you.

Search Console Verification

Make sure you verify the mobile version of your website in Google Search Console.

Structured Data

Always include the same structured data on your mobile and desktop sites.

Your URLs in the structured data on mobile pages should be the mobile URL.


Hreflang

If you’re a global company using rel=hreflang, make sure the hreflang annotations on your mobile URLs point to the mobile version for each country.

XML Sitemaps & Robots.txt

All links in the sitemaps should be available from the mobile version of the site.

This includes your robots.txt file, too.


For all the late nights cursing my laptop and the stress eating that mobile SEO has caused me over the past few years, I’ll be the first to admit that the mobile-first index felt pretty blah.

The majority of the sites I work on are already responsive.

But, if you live for your separate mobile site, I won’t stop gushing about how important a uniform URL structure can be.

The end goal remains the same:

Allow the crawlers to access, read, and digest your content.

Optimize the UX for all devices.

Continue testing for better results.

Mobile search is no longer the future of SEO.

It’s here.

Do you have what it takes to make it out alive?

Image Credits

Featured Image: Paulo Bobita

Screenshots taken by author, September 2023

Introduction To Azure Data Lake Storage Gen2

This article was published as a part of the Data Science Blogathon.

Introduction to ADLS Gen2

The ADLS Gen2 service is built to store structured, semi-structured, and unstructured data in its original file formats. For example, it can store text files, CSV files, JSON files, XML files, images, videos, etc. Once files are uploaded, we can use technologies like Databricks or Hadoop to process and analyze the data as per our business needs.

Data Lake Storage Gen2 makes Azure Storage the foundation for building enterprise data lakes on Azure. Designed to service multiple petabytes of data while sustaining hundreds of gigabits of throughput, Data Lake Storage Gen2 helps you easily manage huge amounts of data.

A fundamental part of Data Lake Storage Gen2 is the addition of a hierarchical namespace to Blob storage. The hierarchical namespace organizes objects/files into a hierarchy of directories for efficient data access.

In this article, we will explore the Azure Data Lake Storage Gen2 (ADLS Gen2) service. We are going to create storage containers and upload data into them from our local system.

Steps to Create ADLS Gen2 Storage Account


Step 1 – You need at least an Azure free tier subscription. I will use a free tier subscription to perform the steps below.

Step 2 – Now we will create the storage service.

Step 2.2 – Type “storage account” and select the only option “storage account” to create this service.

Step 2.4 – Provide the following information in the “Basics” tab of the “Create a storage account” page –

“Project Details” Section –

Subscription – Select the proper “Subscription” to use from the Dropdown.

Resource Group – Select “RG-Storage” from the Dropdown of “Resource Group”.

“Instance Details” Section –

Storage account name – Type the name “blogdemostg” in the provided input box.

Region – Select the default option “East US” from the Dropdown of “Region”. If you want you can change it as per your choice.

Performance – Select the option “Standard: Recommended for most scenarios (general-purpose v2 account)”.

Redundancy – Select the “Locally-redundant storage (LRS)” option from the Dropdown.

Step 4 – Finally our service gets launched and we can see all the credentials that we have defined during the creation of the ADLS Gen2 storage account.
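The portal steps above can also be sketched as a single Azure CLI command. This is not part of the original walkthrough: it assumes the az CLI is installed and logged in, and it reuses the names chosen above (RG-Storage, blogdemostg, East US). The hierarchical-namespace flag is what makes the account ADLS Gen2 rather than plain Blob storage.

```shell
# Create the same ADLS Gen2 storage account from the Azure CLI (sketch).
az storage account create \
  --name blogdemostg \
  --resource-group RG-Storage \
  --location eastus \
  --kind StorageV2 \
  --sku Standard_LRS \
  --enable-hierarchical-namespace true
```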

Uploading Data in ADLS Gen2

We have successfully created our first ADLS Gen2 storage account. Now, we will upload the data inside it using Microsoft Azure Storage Explorer. To upload the data we are going to create a folder with the name “raw”. Inside this folder, we are going to upload our data. Let’s go…

Steps to upload the Data in ADLS Gen2 Storage Accounts

Step 2 – Type the name of your folder inside the provided box as “raw”.

Step 5 – Now your file gets uploaded into the Azure storage. You can check the “Activities” section which will display the status of your task, and whether your task is successful or gets failed.
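The Storage Explorer steps above can likewise be sketched from the Azure CLI. Same assumptions as before (a logged-in az CLI and the blogdemostg account created earlier); data.csv is a placeholder file name.

```shell
# Create the "raw" container (a "filesystem" in ADLS Gen2 terms) ...
az storage fs create --name raw --account-name blogdemostg --auth-mode login

# ... and upload a local file into it.
az storage fs file upload \
  --file-system raw \
  --source ./data.csv \
  --path data.csv \
  --account-name blogdemostg \
  --auth-mode login
```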

This article has covered the following topics:

Details about the ADLS Gen2 storage service.

Steps to create your first ADLS Gen2 storage account.

Connecting your Azure account with Microsoft Azure Storage Explorer.

Creating containers in your ADLS Gen2 storage account with the help of Microsoft Azure Storage Explorer.

Uploading the data files and folders inside your containers from your local system.

Happy Learning!



Is The Tableau Era Coming To An End?

The announcement last week that Tableau’s CEO Adam Selipsky is stepping down felt more significant than the casual media coverage it received. To me, it was a signal that the murmurings of discontent I’ve been hearing were true: the era of Tableau is over.

The Glory Days

While Tableau first came about in 2003, they really hit their stride in the early 2010s — and what a stride it was. Users heralded the tool as ‘revolutionary’ and ‘life-changing.’ Their annual conferences sold out in minutes. Participants would come together with hundreds of others, proudly brandishing swag that read ‘We Are Data People’ as they attended roller-blading socials and “Iron Viz” competitions. As I said, it was having a real moment.

For many of us (I, too, drank the Kool-Aid), it was affirming and exciting to see data being celebrated, not relegated to the sidelines. Tableau told us being in data was not just cool, but also irrefutably important.

What’s Changed?

But instead of these being even more glorious Glory Days, working with data is an all-too-often underwhelming experience all around:

“Machine learning specialists topped its list of developers who said they were looking for a new job, at 14.3 per cent. Data scientists were a close second, at 13.2 per cent.” [1]

And even more damning:

“Among the 90% of companies that have made some investment in AI, fewer than 2 out of 5 report business gains from AI in the past three years.” [2]

Eesh. Clearly, there’s work to be done.

The Haunting

So what are these ghosts that are getting in our way?

Data === Dashboard

To many business users data is now synonymous with dashboards. While a seemingly benign misunderstanding, this actually causes a whole slew of downstream effects, namely:

Thinking Tableau will ‘fix’ your data problems. Many companies make the mistake of assuming the only thing your data team needs is Tableau (or Power BI). This kind of thinking ignores the more common pain points of bringing data sources together, cleaning and transforming the data, and doing the actual analysis itself, which, if you ask any analyst, are the most traumatic parts of any analysis. By not investing in these problems, you’re telling your data team that their work is less important than the business’s interpretation of it.

Asking dashboards to do too much. Since Tableau is the only tool many teams have to present data, they are forced to turn everything into a dashboard, which significantly reduces the impact a more nuanced, thoughtful analysis could have. By stripping away context, explanation, and narrative from the analyst, dashboards become a Rorschach test where everyone sees what they want to see.

While users are now more comfortable looking at basic charts, we’ve made little progress in educating our business partners in fundamental data concepts. Dashboards don’t give us the stage needed to explain, for example, why correlation does not equal causation. This means it’s become nearly impossible to explain the significance of our more complicated predictive models or statistical analysis which are required to realize the dreams of our current era.

Hyper Specialization of Tools

One of the great things about Tableau at the start was that it just sat on top of your database, making it easy to ‘plug in’ to your existing stack of data tools without much effort. This model has been used by pretty much every data tool since, creating separate tools for data pipelines, data cleaning, data transformation, data analysis, and of course, data visualization. This approach is completely fragmenting analysts’ workflows, causing significant pain and delays in each analysis. As a result, most analysts and data scientists have adopted a ‘not my data tool’ mentality — acknowledging Tableau as a necessary evil to get their work noticed. Check out this Reddit thread to see for yourself.

“If there were a button that would nuke all the Tableau servers in the world, I am pressing that button.” – Anonymous Data Professional

Remember those ‘murmurings of discontent’ I mentioned at the start…


We have an increasingly urgent need to find solutions to these issues before we find ourselves again fighting for relevancy and attention to data. To do that, we need to start focusing on the following two areas:

Present more than numbers

It’s time to give data more of a voice. Dashboards are great for things where there is a shared context and a straightforward decision. But for many things, those conditions are not met, and therefore we need a new approach.

I, and others, have been banging the drum on data notebooks as a solution for some time now. They can tell the story, explain the methodology, and build nice visuals without sacrificing interactivity or presentability.

By using more notebooks we can start to wean off a culture that’s been jonesing for dashboards. We can start to work with our business partners instead of lobbing questions and charts back and forth over an imaginary wall.

Pick tools the data team wants

Data analysts and scientists see a red flag when a potential employer has Tableau and little else in the way of data engineering or data analysis tools (e.g. running Tableau on your un-transformed MySQL 5 database). This signals that the employer isn't prioritizing the work these analysts will actually do. This needs to stop. ASAP.

Depending on the analysis your team is doing, the 'right' tools will differ. But there are plenty of options out there; you just need to make sure you're investing as much in the work it takes to produce great analysis as you are in a tool that gets the business to look at it.

And hey, you’ll probably end up keeping some of those data scientists that are, according to the stats, most likely shopping around.


We all owe a great deal to Tableau for the current attention data receives in our businesses. To make good on this opportunity though, and move into a new Golden Age of data, we need to address and remedy some of the ghosts of the Tableau era that are holding us back.

Data notebooks present an option that can give your team the flexibility it needs to move past Tableau and into the next era.

At Count, we’re excited to be part of this new movement of data tools designed for modern challenges. You can learn more about the Count notebook here.


[1] Walter, Richard, “How machine learning creates new professions — and problems,” Financial Times, November 2023.

[2] S. Ransbotham, S. Khodabandeh, R. Fehling, B. LaFountain, D. Kiron, “Winning With AI,” MIT Sloan Management Review and Boston Consulting Group, October 2023.

[3] Header image by Luke Chesser on Unsplash

The media shown in this article are not owned by Analytics Vidhya and are used at the Author's discretion.


HTTP Request in Power Automate – An Introduction

Despite having a variety of connectors that we can choose from, Power Automate also has its limitations. It can’t cover everything we might need in order to build our workflows. There might be an instance when we want to integrate or trigger our flow using an application that’s not available in Power Automate.

The HTTP and Request connectors allow our flow to interact with third party APIs.

We’ve used Request to trigger a workflow in one of our previous tutorials. In that example, we had a third party application that triggered a Power Automate flow with approvals and conditional logic.

And that’s how we can basically connect a third party application to our flows. 

On the other hand, the HTTP connector allows us to ping any third party API that we have. So it’s not necessarily used as a trigger. It’s mostly used as an action.

Let’s say we have a flow where we have to get some information from an API. We send the customer’s ID via that API and we get back the customer’s name or passport number. Then we need that information in our Power Automate logic.

But we don’t want to keep that sensitive information within Power Automate. Therefore, we need to create an API in our third party system that can take in HTTP requests. And that’s what we’re going to do as an example.
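To make the idea concrete, here is a minimal Python sketch of what the HTTP action would do behind the scenes: build a GET request to a customer-lookup endpoint. The base URL, path, and header are assumptions for illustration only, not a real API.

```python
# Sketch of the HTTP request the flow would issue to a hypothetical
# customer-lookup API. Everything here (URL, path, header) is assumed.
import urllib.request

API_BASE = "https://example.com/api"  # placeholder, not a real endpoint

def build_lookup_request(customer_id: str) -> urllib.request.Request:
    """Build (but do not send) the GET request for one customer ID."""
    url = f"{API_BASE}/customers/{customer_id}"
    req = urllib.request.Request(url, method="GET")
    req.add_header("Accept", "application/json")
    return req

req = build_lookup_request("42")
# urllib.request.urlopen(req) would then return the JSON body containing
# the customer's name, which the flow uses in its later steps.
```

Power Automate's HTTP action assembles exactly this kind of request from the Method, URI, and Headers fields you fill in.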

I currently don’t have a third party API, so for this particular example we’ll be using a public test API website. It has a third party API endpoint we can use to check whether our flow is working. For example, it has a function called LIST USERS with a corresponding request, and if we call it, the items within the Response column are everything we can get back.

Then, copy the URL. This serves as the API endpoint.

Then paste the URL that we have previously copied.

We can also enter headers, queries, and cookies if we want to. However, we’ll be skipping that for now.

Search and select the Slack connector.

Let’s post the contents to the random channel.

For the Message text, let’s choose the Body and see what we find.

Let’s rename our flow to HTTP Example.

Add a new step and choose the Data Operation connector.

Then choose Parse JSON.

Move the Parse JSON step in between the HTTP request and the Post message action.

Copy the codes from the request link.

As we noticed, it automatically created the schema that we can easily understand.
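For a sense of what that generated schema looks like, here is a simplified sketch (the real generated schema enumerates every field in your sample payload; the field names below are assumptions based on a typical LIST USERS response):

```json
{
  "type": "object",
  "properties": {
    "data": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "id": { "type": "integer" },
          "first_name": { "type": "string" },
          "last_name": { "type": "string" }
        }
      }
    }
  }
}
```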

For the Content field, we need to put in whatever we want to parse. In this example, we want to parse the Body that we get from the HTTP request.

And why is this important? Well, we don’t have to post the whole body of the message anymore. Now, we have access to more dynamic contents such as email, first name, last name, avatar, company, URL, and many more.

Parse JSON allows us to take any JSON output we get, parse it into different dynamic content that we can then use later on in our subsequent flow steps. We used this action so that our flow won’t Slack the whole JSON content, and only displays the relevant information that we actually need.

Now, instead of Body, let’s change this to first_name.

Then, we’ll see that it changed our action into Apply to each.

This is because our flow receives multiple first names from the request.

As we can see, it only displays the first names now.
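The Parse JSON plus Apply to each combination can be sketched in Python roughly as follows: parse the raw response body, then loop over the users and keep only each first name. The payload shape ("data", "first_name") mimics a typical LIST USERS test response and is an assumption.

```python
# Rough analogue of the flow: Parse JSON, then "Apply to each" user.
# The field names ("data", "first_name") are assumed for illustration.
import json

raw_body = """
{
  "page": 1,
  "data": [
    {"id": 1, "first_name": "George", "last_name": "Bluth"},
    {"id": 2, "first_name": "Janet", "last_name": "Weaver"}
  ]
}
"""

parsed = json.loads(raw_body)            # ~ the Parse JSON action
first_names = [user["first_name"]        # ~ Apply to each user in data
               for user in parsed["data"]]
```

This mirrors why Power Automate wrapped the Post message step in Apply to each: the request returns a list of users, so the loop posts one first name per user.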

All things considered, we were able to create an HTTP request that integrates our flow with a third party application. From there, we parsed the JSON content using the Data Operation connector in Power Automate, which automatically generated a schema from a sample JSON payload. By parsing the JSON, we transformed a typical HTTP response into a more relevant and understandable piece of information.

We can definitely do tons of different things with the HTTP request and Parse JSON actions. Hopefully, you were able to understand their importance and how they work. 

All the best,

