In this episode, we are going to look at six powerful functions within Google Sheets that every SEO needs to know. Google Sheets is a powerful tool that lets us look at our data, manipulate our data, and dive deeper to get a better understanding of what we need to do and the areas where we need to improve. I’m going to show you some cool things that I use almost every single day to help us grow as an agency and surface opportunities for our clients as well. If you’ve got a favorite function and we don’t cover it in this video, feel free to share. We’d love to continue that conversation here in the community and help everybody be the very best they possibly can be.
In this video, we’re going to be talking about six Google Sheets functions that every SEO should know. I’m going to walk you through each of the functions. Then I’m going to show you examples of how we can use them within Google Sheets. This can really save you time and also help you combine a number of different data points, which is very useful when you’re trying to really understand what’s going on with a particular project or a specific dataset.
The first one is probably the one I use the most, and it’s VLOOKUP. VLOOKUP is used to pull data from one spreadsheet into another. There is a caveat to this, though: you have to have at least one shared value in each of the Sheets. You need to have either a keyword on each of the Sheets or a URL on each of the Sheets, because VLOOKUP needs that vertical lookup value in order to find the information. The syntax is relatively easy.
It starts with an equals sign and VLOOKUP. Next, you’ll have the search key: the cell or column of cells that you want to use as the key. Next, you’re going to have the range, the series of cells you’re going to be looking at on the other Sheet; the index, the actual piece of information you need; and then whether the range is sorted. Let’s take a look at how we can leverage VLOOKUP. In this dataset, I have a list of top queries that we pulled from Search Console. Search Console gives me the clicks, the impressions, the click-through rate, and the average position we rank for each of these search queries. Now, this is really helpful information, but what if I want to know the search volume for these queries as well? I can go into my favorite SEO tool, pull all the search volume data like we’ve done here, and then compare it Sheet against Sheet.
Used for: Pulling data from one spreadsheet into another spreadsheet, as long as the two sheets share a common value.
Function: =VLOOKUP(search_key, range, index, is_sorted)
Now, this is helpful, but honestly, copying and pasting the volume over could be a pain in the rear end. This is why VLOOKUP is so powerful. What we need to do is start on this sheet, the search queries page, the page we want to add the search volume to. We go ahead and click here, add a column, and we’ll call this column volume. To find the search volume for this term, which is located on the second Sheet, we want to use VLOOKUP. So we’ll type an equals sign and VLOOKUP. As you can see, Sheets will actually autocomplete it for you. Then I’m going to select my key, which is the A column, type a comma, and now I want to go over to the second Sheet and select my range. Since I’m only looking to find volume, I simply need to go out to the volume column here.
I don’t need to go all the way to the end. If I needed to pull more information, say a competitive score or maybe the cost per click, I could extend the range further as well. I really just want the volume, so I’ll stop there and press the comma key again. Now I need the index, and I want the values in the second column of my range, so that’s one, two: I’ll enter 2 for the second column. And I want the exact value, so what I need to add there is FALSE. This gives me the exact match, which is in this column. I hit enter, and it gives me an N/A for this one, because there wasn’t a search volume for this query, but if I want to fill in all the top 100, I just double-click this right here.
As you can see, it fills all of that out for me. Now I actually know the volumes, how much traffic each one of these queries gets, without having to copy and paste back and forth with my data.
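Under the hood, an exact-match VLOOKUP is just a keyed scan down the first column of a range. Here is a rough Python sketch of that behavior; the query and volume data are invented for illustration, not taken from the dataset in the video:

```python
def vlookup(search_key, table, index, is_sorted=False):
    """Exact-match VLOOKUP: find search_key in the first column of
    `table` and return the value in the `index`-th column (1-based).
    Returns "#N/A" when the key is missing, as Sheets does."""
    for row in table:
        if row[0] == search_key:
            return row[index - 1]
    return "#N/A"

# Stand-in for the second sheet: (query, monthly search volume)
volume_sheet = [
    ("screen recorder mac", 5400),
    ("record screen on mac", 1900),
]

print(vlookup("screen recorder mac", volume_sheet, 2))  # 5400
print(vlookup("some rare query", volume_sheet, 2))      # #N/A
```

Passing FALSE (here, `is_sorted=False`) is what forces the exact match; with a sorted range and TRUE, Sheets would instead return the closest smaller key.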
Used for: Counting the number of characters in a cell
Function: =LEN(insertcell)
The next function we’re going to look at is LEN. LEN is used for counting the number of characters within a cell. This can be helpful for knowing how long a title is if you’ve got a collection of titles within a Sheet. It can also help you count how long your keywords are, things like that. It’s very helpful, and it’s a relatively easy function to use. It’s an equals sign, LEN, and then the cell. Let’s take a look at how to leverage this. Using the same data, if we wanted to see how long these queries are, we can add an additional column to the right and call it length.
Here we’re just going to use the LEN function, so L-E-N, LEN, the length of a string. We’ll select the cell inside the parentheses, and it tells us exactly how long that specific string is. A string is just a set of characters, like text characters. This can be helpful, like I said, when we’re looking at title tags, things like that, where we want to optimize around a specific length. It’s an easy function. It’s easy to add extra data in and, honestly, get a lot more information out of our analysis. So far we know how long each of these keywords is, we’ve got some volume data, and we’ve merged this with the data we’d already pulled from Search Console. We’ve combined two sets of data. Now we have a lot more information to work with. But let’s talk about another function that we can leverage.
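LEN maps directly onto Python’s `len()`. A small sketch of the title-tag use case mentioned above; the titles are made up, and the 60-character cutoff is a common rule of thumb, not something the video prescribes:

```python
# LEN in Sheets counts characters in a cell; in Python that's len().
# The 60-character title limit is an illustrative rule of thumb.
titles = [
    "Best Screen Recorders for Mac",
    "The 15 Best Screen Recording Tools for Mac Users, Tested and Ranked in 2021",
]

for title in titles:
    verdict = "OK" if len(title) <= 60 else "too long"
    print(f"{len(title):3d}  {verdict}  {title}")
```

This is exactly the "title length" column idea: one number per row, ready to sort or filter on.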
Used for: Dividing data around a specified character or string, and putting each piece into a separate cell in the row.
Function: =SPLIT(insertcell, delimiter)
Sometimes we import our data and have a number of data points within a single cell. To get them out, we need the SPLIT function. SPLIT takes that data, uses a specific character to break it into pieces, and separates those pieces into cells across a row. To do this, we use the SPLIT function by selecting a specific cell and supplying the delimiter that splits the cell up. Let’s see how this works. Here, we’ve got a number of pieces of data that we’ve exported from our tool. As you can see, we’ve got trends, but all of the trend line data is actually in a single cell. In its present form, it’s not very useful. It’s frankly hard to read and really get some value out of.
We need to split this information up. The way we do that is with an equals sign and SPLIT, and we click on that. Then we select our cell. Then, what is the delimiter? Looking at the data, we can see that it’s separated by commas. So we use quotes, then the character that’s splitting the data up (if it were a semicolon, we’d use that instead), and then we close the quote. That way it knows how to break up the data for us. Hit enter. As you can see, it’s split all this information out. A quick hack: if you double-click the fill handle, it’ll actually carry all that information down below. Now we’ve got all of the trend data split out into individual rows and cells.
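SPLIT is the same operation as `str.split` in Python. A minimal sketch, where the comma-packed trend cell is an invented example of the kind of export described above:

```python
# SPLIT in Sheets breaks one cell into several around a delimiter;
# Python's str.split does the same job. trend_cell is a made-up
# example of a comma-packed export cell.
trend_cell = "81,81,64,70,81,100,83"

values = [int(v) for v in trend_cell.split(",")]
print(values)  # [81, 81, 64, 70, 81, 100, 83]
```

Converting the pieces to integers at the same time is the bonus you get from doing it in code; in Sheets, SPLIT will coerce numeric-looking pieces to numbers for you.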
Used for: Making a quick sparkline graph right within your sheet.
Function: =SPARKLINE(insertcell range)
The SPARKLINE function works really well right after we’ve used the SPLIT function. A lot of times, I’ll get data like that pulled from SEMrush. Now I want to actually see that trend line data. If I’m inside of SEMrush, I can see it, but if I export the data and put it into a Sheet, I just see a bunch of numbers. Sheets has this really cool function that lets us create a quick sparkline graph right within a cell. This can help us recognize and visualize the data, which lets us get a lot more meaning from it. Let’s look at the SPARKLINE function. Now we have all this trend data and we’ve split it out here, but again, it’s still not super handy if we’re just looking at it. It can be very overwhelming.
We really just want to see this trend data in the form of a sparkline. We type the equals sign and start typing SPARKLINE, and Sheets will bring it up for you. Then you just select the range of data you want used within the sparkline and close the parenthesis. Now you can see I’ve got this really cool little graph that shows me the trend line. This is really helpful for knowing which keywords are trending up as well as which ones are trending down. We don’t have to mess with all of this additional data. We can see it right here within our Sheet. And now we know very quickly which terms maybe we should target now, as opposed to not worrying about them as much, or at least we understand the trends in the data itself.
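The sparkline itself only exists inside Sheets, but the judgment we draw from it, whether a keyword is trending up or down, can be sketched in code. The heuristic below (comparing the average of the earlier half of the series against the later half) is my own illustrative choice, not something from the video:

```python
# A rough "which way is this sparkline pointing?" check: compare the
# mean of the first half of the series with the mean of the last half.
def trend(values):
    half = len(values) // 2
    earlier = sum(values[:half]) / half
    later = sum(values[-half:]) / half
    if later > earlier:
        return "up"
    if later < earlier:
        return "down"
    return "flat"

print(trend([10, 20, 30, 45, 60, 80]))  # up
print(trend([90, 70, 50, 30, 20, 10]))  # down
```

Run against each row of split-out trend data, this gives you the same quick triage the sparkline column gives you visually.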
What’s really nice about working in Sheets is that you can do a lot of analysis from within one document. You don’t have to rush around to different tools if you know how to really leverage the Sheet. Once you have your data, you can manipulate it quite a bit. We’ve seen this with VLOOKUPs, splitting our data, creating sparklines, and looking at how long strings are.
Used for: Pulling data from any of various structured data types, including XML, HTML, CSV, TSV, RSS, and ATOM XML feeds.
Function: =IMPORTXML(url, xpath_query)
IMPORTXML is another huge tool that we can use. It can be used to scrape data, or pull in data, from a number of structured data types, including XML, HTML, and quite a few others. The function, again, isn’t very demanding, but you do need to understand the different XPath queries you can use. Now, I don’t remember all of these, and this is why the internet is such an amazing thing.
We can look those up so that we can really leverage this awesome function. You would type in IMPORTXML, you’d have a URL or a cell, and then you would have the XPath query. Let’s look at a couple of ways that we could leverage this. I pulled some URLs from a search result, and now I want to learn a little bit more about them. I’ve got these all in a Sheet, and I want to make sure that I have their title and their meta description. Let’s say I’m doing some keyword research, and I want to understand the SERPs a little bit better. In order to get the title, I’m going to use IMPORTXML. We’re going to start typing import, and make sure you choose the right one. This is IMPORTXML, and now I’m going to need a URL.
This is very important: I click on this cell because the cell is a URL. Then I’m going to use quotation marks, and inside the quotation marks I’m going to type forward slash, forward slash, title. Now I make sure I close this and press enter. What it’s done is it’s looked up that URL, and it’s given me back the title tag. Again, I can get all the title tags for these pages by double-clicking that, or just dragging that function down. Pretty cool, right? You can do that very quickly. You don’t have to go back and copy and paste and do all that stuff that’s extra hard.
You can get rid of duplicates. Now you can understand, for this query, which was something like best screen recorder for Mac, which pages are ranking and what type of content, too. Right now, we can see that for all of these various queries, it seems to be listicle articles that are ranking really well. If we’re trying to rank for this, or we’re trying to match user intent, we might need to create a listicle piece of content as well. Let’s say we want to get a little more information here and we want to find the meta description.
We’re going to use the exact same function that we used before, IMPORTXML. And we’ll go ahead and use the URL again. Now, I don’t have the meta description XPath memorized, and like I said, this is where Google is really great. This guy has some pretty cool stuff on his site, and he’s given us the query right here. I can copy that. Again, I want to put this in double quotes, paste it in, and there we go. Now we’ve got meta descriptions.
We can drag that down. By using IMPORTXML on the list of URLs I already had, I found all the titles and all the meta descriptions. Again, now we can use some of the other functions we talked about before. Let’s say I want to know how long these title tags are. I could add a title length column and see how long they are using LEN, right? So you can actually do a ton of analysis without ever leaving Sheets. Not once have I left Sheets, and I’ve pulled all this information. I can do the same thing for description length. So while having a crawler like Screaming Frog or Sitebulb is certainly really good, these functions Google Sheets gives us are really helpful as well for doing things very quickly when we need some quick analysis or want to combine our data.
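IMPORTXML evaluates an XPath like `//title` against a live page. The sketch below applies the same idea to a static HTML snippet using only Python’s standard library, so it runs without network access; the page content is invented for the example:

```python
# Extract the <title> text and the description <meta> tag from HTML,
# mirroring IMPORTXML's "//title" and meta-description queries.
from html.parser import HTMLParser

class TitleMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

html_doc = """<html><head>
<title>Best Screen Recorders for Mac</title>
<meta name="description" content="A roundup of screen recording tools.">
</head><body></body></html>"""

parser = TitleMetaParser()
parser.feed(html_doc)
print(parser.title)        # Best Screen Recorders for Mac
print(parser.description)  # A roundup of screen recording tools.
```

For a real crawl you would fetch each URL first, but the parsing step, find the title, find the description meta, is the part IMPORTXML is doing for you inside the Sheet.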
Used for: Pulling in data from APIs
Function: =IMPORTDATA(url)
The last function we’re going to talk about is called IMPORTDATA, and I often use it to pull data from APIs directly into Sheets. The function is quite simple. It’s just IMPORTDATA, then your URL, the API URL of the request that you’re making. Then, based on that request, you’ll get certain data points back. Let’s look at how we can leverage this. For this example, I’m going to be using SEMrush’s API. All APIs are a little bit different, so it may not work for you in the exact same way, but I would look at some other ideas and other things people are doing with IMPORTDATA if you can’t do it the exact way I’m showing you here. But this is a really cool way to import lots of information into your Sheet, again, without having to go to other tools.
Once again, we’re here at this last Sheet, where we’ve got URLs and we’ve pulled our titles and our metas, but let’s say I just want to get some more SEO metrics about these specific URLs. This is where I could use IMPORTDATA. Now, there’s a little function that I’ll commonly use ahead of this, because a lot of times APIs will stack data downward, and that’s going to break once we do this over and over, because it’s going to be overwriting the data. I like to begin all of these with TRANSPOSE, and this is a really cool one, too. This is actually a seventh function; I’m giving you an extra one. TRANSPOSE tells it, instead of returning the data downward, I want you to transpose it, so it goes left to right instead. You can stack these functions. Now that I have TRANSPOSE, I can also use IMPORTDATA.
This is the function we’re talking about. You’re going to need a URL to do that, and we’ll go get that next. SEMrush has an API. It is very granular, I would say, but it can be really helpful if you’re trying to get some quick metrics and understand how things work. In this case, we’re actually going to be looking at URL reports. We can look at specific URLs, and I want to see some organic keyword metrics around these specific pages. You’re going to need an API key to make this work. If you don’t have an API key, you’re not going to get data back, but it’s a relatively simple request. I would copy this and paste it in here. What this is saying: the API key goes in here, and I just want to display a maximum of 10 results.
There are a number of different fields we can request here. Now, I don’t need all of these. I just really want to look at keywords, and I want to look at a particular URL. To make this work, we don’t want to type the URL in by hand every single time. We want to actually reference the cell, so you have to use double quotes and then ampersands, and that lets the formula fetch the URL from a specific cell. I’m going to run this in one second with my API key, and I’ll show you exactly how the data works. Now I’ve put in my API key, and you can see that we have a couple of data points.
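The quotes-and-ampersands trick is just string concatenation: the fixed parts of the request URL live in quotes, and the cell reference is glued in with `&`. A Python sketch of the same idea; the endpoint, key, and parameter names below are placeholders, not real SEMrush values:

```python
# In Sheets: =IMPORTDATA("https://api.example.com/?key=" & B1 & "&url=" & A2)
# In Python, the same URL assembly is plain string concatenation.
api_key = "YOUR_API_KEY"                                  # placeholder
page_url = "https://example.com/best-screen-recorders"    # the "cell" value

request_url = (
    "https://api.example.com/?type=url_organic"
    "&key=" + api_key
    + "&display_limit=10"
    + "&url=" + page_url
)
print(request_url)
```

Keeping the key in its own cell (here, its own variable) means you can drag the formula down a column of URLs and only the cell reference changes.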
We have keywords, position, and search volume, which all show up here, running left to right. Now I’ve got 10 keywords: which keyword each page is ranking for, what position it’s in, and the search volume. I can do this for my top 10 pages, and it might take some time, it really depends on how the API runs, but now I’ve got a ton of keyword data on each one of these URLs. I’ve got some of the top terms they’re ranking for, the positions they rank in, and the search volume. That was all done with IMPORTDATA. I used one little caveat, as you remember: the TRANSPOSE function, to push that data from left to right, because otherwise it would have gone straight down and repeated over itself.
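The whole IMPORTDATA-plus-TRANSPOSE pattern can be sketched in a few lines: parse a delimited API response into rows, then flip rows into columns. The payload below is an invented stand-in for an API response (SEMrush-style semicolon-delimited text), and `zip(*rows)` plays the role of TRANSPOSE:

```python
# Parse a delimited API response (as IMPORTDATA does), then transpose
# rows into columns (as wrapping the call in TRANSPOSE does).
import csv
import io

api_response = """keyword;position;volume
screen recorder mac;3;5400
record screen on mac;7;1900"""

rows = list(csv.reader(io.StringIO(api_response), delimiter=";"))
transposed = [list(col) for col in zip(*rows)]

for col in transposed:
    print(col)
# ['keyword', 'screen recorder mac', 'record screen on mac']
# ['position', '3', '7']
# ['volume', '5400', '1900']
```

After transposing, each response occupies one row running left to right, so stacking one request per spreadsheet row no longer overwrites the row below it.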
These are six powerful functions that we can leverage to get more out of Google Sheets. As an SEO, you probably work in Sheets quite a bit. You probably work with these different data points, and this is going to help you use your data more efficiently and more effectively. If you have any questions about what we talked about today, please comment below. I’d love to continue that conversation with you, and until next time, happy marketing.