ZoomInfo Scraping through VBA

Scrape Data From ZoomInfo Through VBA

Hi, have you ever tried to scrape data from the web, from sites like Bloomberg, Google SERPs, ZoomInfo, Yelp, Justdial and so on? In this blog I am going to explain a way you can scrape data from ZoomInfo. I will also cover how to scrape data from other websites in later blogs.

This time I am going to explain how to scrape data from ZoomInfo using Excel VBA. If you are already familiar with Excel VBA, this will be easy for you; and even if you don't know much VBA, you can still follow this trick and scrape plenty of data of your own choosing. First, open the Visual Basic editor by pressing Alt+F11. At the top left you will find the Insert menu; click it, select Module, and there you go, you have inserted a new module. Next, name your macro; you can give it any name you wish. In this case I am naming it zoominfoscrap, so the code will start like this:

Sub zoominfoscrap()

End Sub

Now you have to write the commands to open Internet Explorer and navigate to the ZoomInfo page.

Sub zoominfoscrap()
Dim IE As New SHDocVw.InternetExplorer
Dim url As String
url = Range("A2")

IE.Visible = True
IE.navigate url

End Sub

Keep in mind to set the correct references in Visual Basic before running this code. Go to Tools > References and tick Microsoft HTML Object Library and Microsoft Internet Controls. Put the ZoomInfo URL in cell A2 of your Excel sheet, or else assign the URL directly in the code. For example:
url = "https://www.zoominfo.com/c/at-t-inc/3194954"
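If you would rather not set references at all, late binding is an alternative. This is just a sketch: with late binding you lose IntelliSense, and the named constant READYSTATE_COMPLETE is not available, so you use its numeric value 4 instead.

Sub zoominfoscrapLateBound()
    ' Late-bound alternative: no Tools > References needed.
    Dim IE As Object
    Dim url As String

    Set IE = CreateObject("InternetExplorer.Application")
    url = Range("A2").Value

    IE.Visible = True
    IE.navigate url

    ' READYSTATE_COMPLETE is not defined without the reference; its value is 4.
    Do While IE.readyState <> 4
        DoEvents
    Loop
End Sub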

The above code opens Internet Explorer and navigates to the ZoomInfo link you provided. At the end of this blog I will also explain a way to visit many ZoomInfo links at once and scrape all the data using a loop. And if you have a thousand companies whose data you want from ZoomInfo, you will need to collect their ZoomInfo links first; I will discuss how to get the links of a thousand companies at once in my next blog. Once Internet Explorer is open, you have to tell your program to wait until the site has loaded fully.

Do While IE.readyState <> READYSTATE_COMPLETE
Loop
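A bare Do While loop like the one above can peg the CPU and freeze Excel while it spins. A slightly more defensive variant (a sketch, not required for the rest of the tutorial) also checks the browser's Busy flag and yields with DoEvents:

' Wait until IE reports both "not busy" and "document complete",
' yielding control so Excel stays responsive while waiting.
Do While IE.Busy Or IE.readyState <> READYSTATE_COMPLETE
    DoEvents
Loop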

The code above makes your program wait until Internet Explorer has fully loaded the site. Once your ZoomInfo link has loaded completely, you next need a handle on the page's HTML document, and for that you use the following variables in your code. Here I have named them HTMLDoc and HTMLItems; you can name them anything else you want.

Dim HTMLDoc As IHTMLDocument
Dim HTMLItems As IHTMLElement

HTMLDoc will hold the document object and HTMLItems will be used to pull data out of it. I will show you how. Suppose we need to scrape the company's standard name, street address, pincode and contact number from ZoomInfo. The following code first scrapes the company's standard name; later we will move on to the rest. If you go to a company's ZoomInfo page and right-click on the company name, you will get an option to Inspect Element; click it and the HTML code of that page appears on the right side of your window, like this:

Sub zoominfoscrap()
Dim IE As New SHDocVw.InternetExplorer
Dim HTMLDoc As IHTMLDocument
Dim HTMLItems As IHTMLElement
Dim url As String
url = Range("A2")

IE.Visible = True
IE.navigate url
Do While IE.readyState <> READYSTATE_COMPLETE
Loop
Set HTMLDoc = IE.document
Set HTMLItems = HTMLDoc.getElementsByTagName("span")(2)
Cells(2, 2).Value = HTMLItems.innerText

End Sub

The code above scrapes the standard company name shown on ZoomInfo into row 2, column 2 ("Cells(2, 2)"). HTMLDoc receives the page's document object via IE.document, and HTMLItems then grabs the data by tag name, here the span tag at index 2. In the same way you can scrape everything else ZoomInfo shows, such as the company's street address, contact number and so on. Just follow this code:

Sub zoominfoscrap()
Dim IE As New SHDocVw.InternetExplorer
Dim HTMLDoc As IHTMLDocument
Dim HTMLItems As IHTMLElement
Dim url As String
url = Range("A2")

IE.Visible = True
IE.navigate url
Do While IE.readyState <> READYSTATE_COMPLETE
Loop
Set HTMLDoc = IE.document
Set HTMLItems = HTMLDoc.getElementsByTagName("span")(2)
Cells(2, 2).Value = HTMLItems.innerText
Set HTMLItems = HTMLDoc.getElementsByTagName("span")(4)
Cells(2, 3).Value = HTMLItems.innerText
Set HTMLItems = HTMLDoc.getElementsByTagName("span")(6)
Cells(2, 4).Value = HTMLItems.innerText
Set HTMLItems = HTMLDoc.getElementsByTagName("span")(8)
Cells(2, 5).Value = HTMLItems.innerText
Set HTMLItems = HTMLDoc.getElementsByTagName("span")(10)
Cells(2, 6).Value = HTMLItems.innerText
Set HTMLItems = HTMLDoc.getElementsByTagName("span")(12)
Cells(2, 7).Value = HTMLItems.innerText
Set HTMLItems = HTMLDoc.getElementsByTagName("span")(19)
Cells(2, 8).Value = HTMLItems.innerText
Set HTMLItems = HTMLDoc.getElementsByTagName("span")(18)
Cells(2, 9).Value = HTMLItems.innerText

End Sub
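One caution: picking spans by position, as above, breaks as soon as ZoomInfo changes its page layout. If Inspect Element shows a stable class name or attribute on the element you want, targeting it directly is sturdier. This is only a sketch: ".company-name" below is a hypothetical selector, not ZoomInfo's real markup, and querySelector is only available when the page renders in IE9 or later document mode.

' Sketch: target an element by CSS selector instead of span position.
' ".company-name" is a placeholder; substitute whatever Inspect Element shows.
Dim companyEl As Object
Set companyEl = HTMLDoc.querySelector(".company-name")
If Not companyEl Is Nothing Then
    Cells(2, 2).Value = companyEl.innerText
End If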

The above is the basic code to scrape data from ZoomInfo. Now let me show you how to navigate multiple links and scrape all the data. For this you have to use a loop. First of all, put all the companies' ZoomInfo links in column A, starting from row 2. After that you have to make a few small changes to your code. First declare a variable i at the beginning of the code, like this: Dim i As Integer
And after that you can set up the loop like this:
For i = 2 To 100
    ' your scraping code here
Next i

Since we have pasted all our links in column A, our main task is to make the program visit the links one by one, scrape the data and paste it in a given order. First we need to find the last row containing a link. For this we declare another variable: Dim lRow As Long
and set the lRow value like this to find the last row:
lRow = Cells(Rows.Count, 1).End(xlUp).Row

Now that we have identified the last row, we can loop through to it and scrape all the data from those links one by one.
The following is the complete code:

Sub zoominfoscrap()
Dim IE As New SHDocVw.InternetExplorer
Dim HTMLDoc As IHTMLDocument
Dim HTMLItems As IHTMLElement
Dim i As Integer
Dim lRow As Long
Dim url As String

lRow = Cells(Rows.Count, 1).End(xlUp).Row

For i = 2 To lRow
IE.Visible = True
url = Cells(i, 1)
IE.navigate url

Do While IE.readyState <> READYSTATE_COMPLETE
Loop
On Error Resume Next
Set HTMLDoc = IE.document
Set HTMLItems = HTMLDoc.getElementsByTagName("span")(2)
Cells(i, 2).Value = HTMLItems.innerText
Set HTMLItems = HTMLDoc.getElementsByTagName("span")(4)
Cells(i, 3).Value = HTMLItems.innerText
Set HTMLItems = HTMLDoc.getElementsByTagName("span")(6)
Cells(i, 4).Value = HTMLItems.innerText
Set HTMLItems = HTMLDoc.getElementsByTagName("span")(8)
Cells(i, 5).Value = HTMLItems.innerText
Set HTMLItems = HTMLDoc.getElementsByTagName("span")(10)
Cells(i, 6).Value = HTMLItems.innerText
Set HTMLItems = HTMLDoc.getElementsByTagName("span")(12)
Cells(i, 7).Value = HTMLItems.innerText
Set HTMLItems = HTMLDoc.getElementsByTagName("span")(19)
Cells(i, 8).Value = HTMLItems.innerText
Set HTMLItems = HTMLDoc.getElementsByTagName("span")(18)
Cells(i, 9).Value = HTMLItems.innerText

Next i
MsgBox "done"
End Sub

I hope you have successfully been able to scrape the data. If you face any sort of problem while running this program, just drop your query in the comment section and we will get back to you right away. Thanks for giving your valuable time and reading this article.
