Categories: SEO

The Ultimate Guide To GSA Search Engine Ranker!

GSA Search Engine Ranker

As a reader of my blog, you can save 7% on the retail price of any GSA tool! To get this discount, use any of the links below or in my posts, then enter the coupon code “mt4.in7” on the checkout page.

Welcome to my ultimate guide to GSA SER! I was actually toying with the idea of not making this post due to the amount of time and effort required to do it correctly. I have seen other people post less detailed versions similar to this, but I am hoping my own in-depth version will not only be an excellent introduction for people brand new to the tool but also offer something for more experienced users too.

I know that some of the stuff in here is very basic but I see no point in making an ultimate guide to a tool if someone completely new to using it can’t be linked to it, read it and have a decent idea of what they are doing. Additionally, I have added as many tips as I can possibly think of that I have discovered over the years so I am hoping that these can offer something of value to readers who have already used the tool.

After trying to work out the best way to present this post, I decided to use the free tool Skitch to take screen grabs, overlay them with numbers and then do a breakdown of each number. GSA SER is such a massive tool that I doubt anyone other than Sven knows what every option does, but I did my best to provide an answer for every option. I would also like to say that GSA Search Engine Ranker is not perfect, but in my opinion it is the best automated content management system link builder on the market today. So without further ado, let's get started.

Let's Start At The Beginning Of GSA SER!

Here we have the main window of the tool.

1 – Centre Left – Project Pane

Once you have created projects and groups they will be displayed here along with the status of the project, the priority level of the project, the number of submissions the project has made as well as the number of verified URLs the project has.

2 – Centre Right – Last Verified URL Pane

Once your projects are producing verified URLs they will automatically be added here, up to a maximum of the last 1000 verified URLs. The date and time each URL was verified is displayed along with the URL itself and the flag of the country where the domain's server is located, with a no/do follow link breakdown at the bottom of the pane. Additionally, verified links are displayed with different background colours: green represents a do follow link and yellow a no follow link.

3 – Bottom – Log Pane

When you activate projects, their details will be displayed here; the date and time of each log entry are shown along with the project making the entry and the entry message.

4 – Upper Right – The New Project Button

Pressing this button opens a new project window for the user to create projects. The small drop-down arrow to the right of the icon can be clicked to open a drop-down menu. By default, the drop-down only has options for a new project and a new project with a user-set name. If the user has created templates then they will also be displayed here, as shown in the screenshot below, allowing the user to click a template name and have its settings preloaded into the new project.

5 – Upper Right – The Edit Project Button

Once the user has created projects, they are able to highlight a project in the project pane and this button will become enabled. Clicking it provides the same functionality as double-clicking a project in the project pane, causing GSA SER to open the project for editing. The edit project button also has a small downward-pointing arrow to its right; clicking this provides the same functionality as right-clicking a project in the project pane, as shown below.
Hovering over most of the options presents the user with additional options, as shown below.
Active and Inactive – Turns a project on and off.

Active (Verify Only) – Project will only verify already submitted URLs.

Active (Verify E-Mails Only) – Project will only check and verify emails.

Active (Use Global Site List Only) – Project will only pull targets from site lists if enabled.

Active (Search Only) – Project will only pull targets from search engines if enabled.

Active (Search and Verify) – No idea what this does.

Active (Remove Links) – Project will remove verified links.

Active (Re-Verify) – Project will reverify its verified links removing dead URLs.

6 – Upper Right – The Delete Button

Pressing this while an inactive project is highlighted in the project pane will delete it. If the project has not previously been backed up, there is no way to retrieve it once deleted.

7 – Upper Right – The Options Button

Clicking this opens the project options window, which I will cover later in the tutorial.

8 – Upper Right – The Start Button

Clicking the start button while at least one project is set to anything other than Inactive will start that project. The start button also has a little drop-down arrow to its right that opens the drop-down menu shown below.
Start – Project will start its actions.

Scheduler (All with Status Active) – Brings up the scheduler to manage projects set to active.

Scheduler (All Selected) – Brings up the scheduler to manage projects selected.

Scheduler (Select By Mask) – Brings up the scheduler to manage projects matched by a mask.

Here is a screenshot of the Scheduler window that will open when any of the scheduler options are selected.

The number of projects to run at the same time can actually be an important and often overlooked feature, depending on how you are using SER. If you are posting to a self-hosted blog network, for example, most content management systems will have built-in security features to prevent multiple simultaneous logins from the same account on different IPs.

This means if you have, for example, ten projects posting to your blog network, it would be a good idea to let the scheduler run one project at a time so that all projects are able to post to the blog network without having their connections interrupted by the content management system's security features.

Although the text says switch to next project after xx minutes, this option works for multiple projects. So if you are running “Scheduler (All Selected)” with ten projects selected and the number of projects to run at the same time set to five, for example, then after the allotted time has passed the scheduler will swap from the first five projects to the last five.
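The rotation described above can be sketched in a few lines of Python. This is only my reading of the scheduler's behaviour, not SER's actual internals, and all the names here are my own:

```python
def schedule_batches(projects, batch_size):
    """Yield successive batches of projects, wrapping around forever.

    Sketch of the scheduler's apparent behaviour: run `batch_size`
    projects, wait the allotted minutes, then swap to the next batch.
    """
    index = 0
    while True:
        yield [projects[(index + i) % len(projects)] for i in range(batch_size)]
        index = (index + batch_size) % len(projects)

# Ten selected projects, five running at a time:
projects = [f"project-{n}" for n in range(1, 11)]
batches = schedule_batches(projects, 5)
first = next(batches)   # project-1 .. project-5
second = next(batches)  # project-6 .. project-10
```

After the second batch finishes its time slot, the generator wraps back around to the first five projects, which matches how the scheduler keeps cycling until stopped.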

You are able to limit the scheduler to a user-defined time and date, meaning you can set a scheduler up to activate projects while you are away from your computer.

9 – Upper Right – The Help Button

Clicking the help button presents the user with various help options as shown in the screenshot below.

10 – Lower Left – The Project Pane Summary

The project pane provides some quick glance information for the user for the projects they have set up in that instance of GSA Search Engine Ranker.

P – Total projects in the instance; when projects are made active, brackets appear with the number of active projects.
S – The total number of submissions made by all projects in the project pane.
V – The total number of verified URLs made by all projects in the project pane.

11 – Bottom Left – Thread Counter

The thread counter displays the total number of active threads GSA SER is currently using.

12 – Bottom Left – Total Submissions Since Last Reset
The submissions here differ from the ones in the project pane summary as they only return to zero when the user right-clicks and manually resets them. Although this may seem pointless, it can be a useful way to track the performance of your setup over the course of a day if the projects in your project pane change for any reason, such as a project being deleted, the reset data options being used or projects being restored from backup.

13 – Bottom Left – Total Verified Since Reset

The same as the total submissions since reset option but for verified URLs rather than submissions.

14 – Bottom Centre – Proxy Counter

The proxy counter shows the total number of private and public proxies added to this instance of GSA SER. For example, if you have 25 private proxies and 100 public proxies 25 | 100 would be displayed here.

15 – Bottom Centre – Statistics Per Minute Tracker

Right clicking this presents a dropdown where the user is able to select the metric to track as shown in the screenshot below.

16 – Bottom Centre – Captcha Statistics

This displays the total number of captchas encountered by GSA Search Engine Ranker and how each service handled them. For example, say you have GSA Captcha Breaker enabled as your first captcha service, a human captcha service enabled as your second, and a text captcha service enabled for your text captchas. SER would display the captcha statistics like this.

GSA Captcha Breaker, human captcha, [text captcha] – Incorrect Captcha Count.

The tracking is unfortunately limited, but some text captcha services do offer in-house tracking on their dashboards showing how they perform with the text captchas they are presented with.

17 – Bottom Centre – Memory Usage

This is the amount of RAM GSA SER is currently using.

18 – Bottom Centre – CPU Usage

This is the percentage of the system CPU that GSA SER is currently using. I try to run my GSA SER instances on VPSs or servers where it will be hard for SER to take up over 50% of the CPU resources, even when blasting at 1800 active threads.

19 – Lower Right – Last Verified URLs No/Do Follow

This will display the total number of no/do follow links verified by your system, up to a maximum of 1000. Depending on how you are using GSA Search Engine Ranker, this can be a good metric for keeping track of the health of your list and its do follow link ratio.

20 – Top Left – Build Version and Licence Owner

This option lets you view your build version of the tool and confirm the license owner.

Here is a screenshot of the tool with active projects running and the various fields populated with data. I have no idea if it will help, but seeing where everything goes with running projects may help some people.

Moving On To The GSA Search Engine Ranker Options Window

21 – Left Side – The Submission Tab

Clicking this presents the user with the submissions window as shown above; by default this is the window SER will open when the options button is first clicked.

22 – Left Side – The Captcha Tab

Clicking this presents the user with the captcha tab covered later in the tutorial.

23 – Left Side – The Indexing Tab

Clicking this presents the user with the indexing tab covered later in the tutorial.

24 – Left Side – The Filter Tab

Clicking this presents the user with the filter tab covered later in the tutorial.

25 – Left Side – The Advanced Tab

Clicking this presents the user with the advanced tab covered later in the tutorial.

26 – Upper Centre – The Thread Limiter

This allows the user to input the maximum number of threads they wish GSA SER to be able to utilize at once. The tool says it is not recommended to use more than 100 threads at any given time, but through testing I have discovered the hard upper thread limit for SER is just over 2000 active threads. I am not sure why this warning message was added; some people say it is aimed at users who use SER to scrape search engines, but all they would have to do to safely run a higher thread count is add more proxies to the tool.

One thing to keep in mind is that GSA SER will count each active project as an active thread, removing it from the user-set limit, which may cause a few minor teething problems. For example, say you have 100 projects posting but you set your active thread limit to 100; all active threads will be used up by your posting projects, limiting the number of URLs you will be able to build with your campaign.
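As a rough sanity check, the arithmetic behind this looks something like the following. This is a sketch of my reading of the behaviour above, not SER's actual internals:

```python
def usable_submission_threads(thread_limit, active_projects):
    """Each active project appears to consume one thread from the
    user-set limit, so the threads left for actual posting shrink as
    you activate more projects (my reading of the behaviour above).
    """
    return max(thread_limit - active_projects, 0)

# 100 active projects against a 100-thread limit leaves nothing for posting:
print(usable_submission_threads(100, 100))  # 0
# Raising the limit to 300 leaves 200 threads for submissions:
print(usable_submission_threads(300, 100))  # 200
```

The practical takeaway is simply to set the thread limit comfortably above your active project count.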

27 – Upper Centre – The Automatic Thread Restrictions

These options let the user define a hard resource limit for CPU usage, RAM usage or both, and if the tool reaches these limits then a limit is imposed on the active thread count. If you are using the tool on a home desktop or laptop these settings may be of use to you, but if you are running on a VPS or server then I strongly recommend you leave these off, as there have been plenty of people on the GSA forum complaining of problems with this as the root cause.

28 – Upper Centre – The HTML Timeout Option

This option lets the user set a custom HTML timeout for when the tool tries to contact a domain. There are a number of factors to take into account when setting this. If you are running a high thread count on a low-spec VPS, laptop or desktop, SER may be taking longer to contact the website from your end.

Domains that end up on premium link lists get hit so much that they may be running slow on their end. Most people recommend between 120 and 180 seconds, but in all honesty I have not tested this much because of the way I pre-filter my lists, making sure my targets are up and healthy.

29 – Right Centre – The Bandwidth Limit

This option allows the user to set an upper bandwidth limit for SER so that it has limited access to the machine's internet connection. If you are running on a desktop or laptop using your home internet connection, then enabling this could be a good idea. If you are using a decent VPS or server, they usually come with a 1GB unlimited connection and can take any punishment SER throws at them.

30 – Centre – Proxy Configuration

Clicking this option opens up the proxy configuration option as shown in the image below.

The proxy config window opens on its list tab by default; you can manually swap between the list and options tabs using the two buttons at the top left of the window. On the list tab, the user is presented with two panes. The upper one is the proxy pane, where the details of their entered proxies are shown: Host (IP), Port, Type, Status, Speed, Source, Private and Anonymous.

Below the proxy pane is the proxy log, which shows the various tasks the proxy config tool completes. Clicking the “Add Proxy” button on the centre left of the pane presents the user with a number of different ways to add their proxies to the tool. The “Test Proxies” button will do an access check on the proxies, making sure you have the correct username and password; if that information is correct it will then do an alive check, and if the proxy is alive it will pull the data for the proxy to fill the different columns of the proxy pane.

The “Delete” button presents the users with the ability to delete all proxies, delete selected proxies, delete non-anonymous proxies or delete unchecked/not working proxies. Clicking the “Add/Edit Proxy Sites” button will present the user with the following screen.

In all honesty, I have never used this as I always use semi-dedicated proxies with SER, but I presume this window lets you select which sites SER will automatically scrape for public proxies. When using GSA SER I highly recommend using premium semi-dedicated proxies as a minimum to get the most out of the tool. In the bottom left of the proxy pane are the checked/unchecked boxes. The user is able to manually check/uncheck proxies in the proxy pane by right-clicking them, and any proxy that fails its alive check will automatically be added to the unchecked column.

If you are using semi-dedicated proxies and some fail their alive check and are added to the unchecked column, I recommend you manually right-click them and mark them as checked so SER is able to use the proxies.

Provided most of the proxies passed the alive check, there is a high chance there was a temporary problem with the particular proxy that failed and it will soon be fine, meaning SER will be able to use it with no problems. If all the proxies fail, double-check your username and password.

Some proxies require IP authentication, so you will have to get the IP of the system you are running the tool on, then log in to your proxy provider's dashboard and manually add your IP to the whitelist to enable the proxies. If you are still facing problems after doing this, I recommend you open a support ticket with your proxy provider.

Clicking the “Options” button in the top left of the screen will open the proxy options pane as shown below.

The proxy options pane allows the user to set up the settings for GSA SER to go out and scrape public proxies. Although public proxies are recommended for scraping search engines, I do not recommend you use them for anything else as they are used by many people, meaning they are slow and go offline sporadically. If you are doing a SER blast this is the last thing you want, as it will just slow everything down and may affect your verifications if a public proxy goes offline while verifying a link.

31 – Lower Centre – Proxy Usage Options

This is where the user is able to set their proxy usage. There are a number of ways people can set this up, but I personally believe the way it is set out in the screenshot is best practise. If you use a premium list service and a catch-all email service for making submissions, then you want to be using private, anonymous, semi-dedicated proxies to get the best out of your services. Not only does this keep your submission speed up, it also hides you from anything on the other end trying to track where the submission is coming from.

As you are using a premium list for your targets, search engines should not be enabled within your project options tab as they will slow your system down, meaning there is no need to enable any proxies for search engines as you won't be using them.

PR is dead, and while I know SER has a few tricks up its sleeve such as the Yandex TIC adjustment, I have seen no point in it with the methods I have used to date, so again I use no proxies here as I do not currently use any PR-based filtering.

When verifying, you can enable your semi-dedicated proxies if you like, but in theory this adds to their bandwidth load and, depending on the size of the web page at the other side, can slow you down.

For example, say you have 50 semi-dedicated proxies set up and you are running 1000 threads on your SER instance. There is no way to be 100% sure what your threads are doing at any given time, but for argument's sake let's say that 500 threads are being used for submissions and 500 threads are being used for verification. The way I have it set up, the 500 verification threads use the connection of the VPS to verify the submissions, leaving the submitting threads to use the semi-dedicated proxies at a rate of 10 threads per proxy.

Now let's say you choose to tick your private proxy box for verification; you still have 500 threads being used for submissions and 500 threads being used to verify. The 500 verification threads are now using your VPS connection but hiding behind the semi-dedicated proxies you have set up as your private proxies, taking up their bandwidth too. With 500 extra threads running over the same proxies, they could slow down, meaning you miss more sites as your HTML timeout option will trigger more often.
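The load difference between the two setups is easy to put into numbers. The function below is my own back-of-the-envelope helper using the figures from the example, not anything SER reports:

```python
def threads_per_proxy(total_threads, submission_share, verify_uses_proxies, proxy_count):
    """Estimate how many threads each proxy carries.

    If verification is routed through the proxies too, every thread
    counts against them; otherwise only the submission share does.
    """
    proxied = total_threads if verify_uses_proxies else int(total_threads * submission_share)
    return proxied / proxy_count

# 1000 threads, roughly half submitting, 50 semi-dedicated proxies:
print(threads_per_proxy(1000, 0.5, False, 50))  # 10.0 threads per proxy
print(threads_per_proxy(1000, 0.5, True, 50))   # 20.0 once verification uses them too
```

Doubling the per-proxy load is exactly why ticking the verification box can start triggering your HTML timeout more often.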

As I use catch-all emails for my SER projects, there is no requirement to use proxies for email checking. The service is set up and hosted by the service provider for their users, and they know exactly what we are doing with it, so this saves you bandwidth on your proxies as well as money, as one catch-all can in theory receive infinite emails.

That being said, if you are using accounts from a popular western email service such as Gmail, then it might be a good idea to use proxies to try to reduce the number of times any given IP address hits their service, as I suspect they will have something on their backend to block an email account if the same IP is logging into a bunch of different email accounts every minute for a few days in a row.

32 – Bottom Left – Backup and Restore Options

This option allows the user to back up or restore their options settings. Although this may seem basic, it is actually a nice feature to have, especially if you are using multiple GSA Search Engine Ranker instances.

For example, say you have an optimized options config that you like and are happy using. You can click backup, transfer it over Dropbox or whatever file-sharing service you are using to link your VPSs together, and then load that same config onto a brand new SER install on a different machine, saving you the time you would need to set all the options up as you like.

Another use is to back up your settings before you tweak things, especially if you have just started using the tool. It was unreal how many times I would change a setting and not remember either the setting I changed or the value it was previously set to when I first started using the tool. This option removes both problems, as you can simply restore from backup.

A third option is that you could, in theory, back up your options and send them to a friend to install on their SER, but I would not recommend it as your instance is tweaked for your proxies, your system specifications and the site list you are using. Giving a backup of this to a friend may do more harm than good, especially if you are using a $15 VPS or a $45 server and they are using an old laptop from 2007 that struggles to run Minesweeper, never mind SER.

On To The Captcha Tab

33 – Upper Left – The Image Captchas Tab

This lets the user swap between the image captchas tab and the text captchas tab, by default SER will open the image captchas tab.

34 – Upper Centre – The Text Captchas Tab

Similar to above but the user can click this to go to the text captcha tab shown below.

The user can add various text captcha services in this window, as tools such as GSA Captcha Breaker are unable to solve them, so additional services are required. This is because text captchas are based on random sums or questions generated by the website and then presented to the user, as shown in this .gif.

35 – Centre – The Captcha Service Pane

This is a visual display of the various captcha services that this instance of GSA SER has access to. One thing to note: at the time of writing, SER is only able to select one captcha service, two captcha services or all captcha services for use on projects. Although this may seem trivial, it can potentially be a big problem depending on how you use SER.

For example, say you have the SEREngines plugin and you are also using your SER instance to build contextual article links and non-contextual links. Ideally, you would have it set out so your non-contextual link projects can only use GSA Captcha Breaker, as its solves are essentially free once you have purchased the tool.

Your contextual article projects would initially use GSA Captcha Breaker, passing any unsolved captchas to an OCR captcha service, and your SEREngines projects would send their captchas directly to a human captcha solving service.

In theory, you could add your OCR captcha service to GSA Captcha Breaker and tell it to send only captchas such as ReCaptcha to your OCR service for solving. The problem, however, is that there is no control at the project level, so a non-contextual project could encounter a ReCaptcha, send it to GSA Captcha Breaker to be solved, and GSA Captcha Breaker would then send it on to your OCR captcha service, taking up your threads and meaning the ReCaptchas from your contextual article projects are unable to get through and be solved.

In a similar way, your SEREngines projects only really want to use your human solved captcha service, but the only current way to do this is to set their captcha settings in the project options tab to all. This means their captchas are first sent to GSA Captcha Breaker and fail, then sent to the OCR captcha service where they very likely fail too, and only after all this time do they reach the human solved captcha service to be manually solved.

I made a suggestion on the GSA Forum that has now been implemented, and this could potentially be a workaround for the issue. The problem is that SER does not give much information on what it classes as a hard-to-solve captcha, so some of the captchas presented to the SEREngines projects may not be classed as hard captchas by SER and will still have to go through the above process to be solved.

I have made a suggestion over on the GSA Forum regarding this matter and Sven has confirmed it is on his long list of things to do, so let's hope we get it before SEREngines comes out of beta testing.

36 – Centre Right – The Captcha Service Controls

The “Add” button allows the user to add captcha services to the captcha service pane and the “Delete” button allows the user to delete captcha services. The “Up” and “Down” buttons allow the user to move captcha services up and down in the list if they have multiple captcha services added.

This is useful as SER works from top to bottom when processing captcha services, and in the project options tab you only have the choice of the first captcha service, the second captcha service or all. The first service is the top one, the second is the second one, and all will run through the services from top to bottom. If you accidentally add your human solved captcha service to the top, you can press the “Down” button so that GSA Captcha Breaker, for example, sits above it, doing the heavy lifting for you and preserving your human solved captcha credits.
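The top-to-bottom behaviour can be sketched as a simple fallback chain. The service objects and the `solve` method here are hypothetical stand-ins of my own, not SER's real API:

```python
def solve_captcha(captcha, services, mode="all"):
    """Try captcha services in list order, mimicking SER's top-to-bottom
    behaviour. `mode` is "first", "second" or "all", matching the three
    choices available in the project options.
    """
    if mode == "first":
        chain = services[:1]
    elif mode == "second":
        chain = services[1:2]
    else:
        chain = services
    for service in chain:
        answer = service.solve(captcha)
        if answer is not None:  # first successful solve wins
            return answer
    return None

class _Stub:
    """Hypothetical stand-in for a captcha service."""
    def __init__(self, name, answer):
        self.name, self.answer = name, answer
    def solve(self, captcha):
        return self.answer

breaker = _Stub("GSA Captcha Breaker", None)  # fails this captcha
human = _Stub("Human service", "cat")
print(solve_captcha("img", [breaker, human], mode="all"))    # cat
print(solve_captcha("img", [breaker, human], mode="first"))  # None
```

This also makes the ordering advice above concrete: with the cheap solver at the top of the list, the expensive human service only sees captchas the earlier services failed on.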

37 – Lower Middle – Successful Captcha Saving Tab

In all honesty, I have never used this and I don't really know what I would use it for. I am yet to try, but if the captcha name is saved as whatever the captcha service replied it was, then it could be a good way to test OCR captcha services to see how accurate they actually are.

38 – Lower Middle – The Pop Up Of Death

This tick box allows the user to sit at their computer watching SER work; if the captcha services get a captcha wrong, it will be presented to the user to solve manually. Even with a pretty low thread count the popup box is always there requiring user attention, so I always have it turned off.

If you are hitting domains with ReCaptcha on them while using a low number of proxies and a high thread count, the ReCaptcha system will detect the same IP hitting its service over and over again, forcing it to return harder and harder captchas that will quickly drive you insane.

39 – Lower Middle – Simultaneous Captcha Setting

In all honesty, I have never changed this; I presume it is the number of open captcha requests you can have between SER and your captcha solvers at once.

Moving On To The Indexing Window!

40 – Upper Centre – Submit To Search Engines

I haven't used this setting. I suspect it attempts to submit the URL to the search engine's submit URL page, but I would imagine there would be proxy issues if it was enabled, given the speed at which SER can kick out verified URLs.

41 – Middle – Search Engine Submit URL Pane

This displays the various search engine pages where people are able to submit URLs to be crawled by the particular search engine spider.

42 – Right Middle – Submit URL Pane Control Options

Similar to the captcha service options previously covered.

43 – Middle – Premium Indexer Details Field

This is where the user is able to add the details for their premium indexing service. Due to varying performance between link indexer services, I plan to maintain a blog post listing indexer services by their index rates.

44 – Bottom – Index Submission Control Options

This offers the user some control over the links that are submitted to their premium indexers, including a drip feed option. The way I filter my lists makes the do follow option pointless, and PR is dead now so we are only left with third-party services to guesstimate what PR would be, but the “Send only certain engine types” option could be useful to prevent you wasting premium indexer credits on things like blog comments.

Moving On To The Filter Tab

45 – Top Centre – Filter Enable Tick Box

This lets the user enable or disable the filtering of domains against the blacklisted websites in the blacklisted pane.

46 – Middle – Blacklisted Pane

The user can tick or untick the various websites that list blacklisted domains to have those domains removed from their projects. In all honesty, I can't remember the last time I enabled this. In theory, if you are pushing a high LPM and SER is checking every submission against these lists, it will slow you down; on the flip side, it may protect your tiers, so it is down to the user which way they choose to go.

47 – Lower Middle – Blacklist Control Options

This enables the user to change the update interval of the sites in the blacklisted pane as well as update or delete sites.

48 – Lower Middle – Download Size Limit

The user has the ability to choose the maximum size of a web page they wish to download. Although it is a personal theory, when doing a non-contextual blast all I care about is getting as many links out there as possible for the search engine spiders to crawl up my tiers, so I set this to 1 MB.

When doing contextual article blasts I increase this to 5 MB, as I then care more about link yield than speed. This is purely out of habit from when I first began using GSA SER, but I have stuck with it all these years.

49 – Upper Left – Folder Actions Tick boxes

GSA Search Engine Ranker has four hard-coded purposes for the four folders the user has available to them. If the user ticks these boxes, then the hard-coded actions are enabled for the various folders. The hard-coded actions are as follows.

Identified – Saves all the targets that it scrapes and then identifies as usable targets.
Submitted – Saves all the targets that GSA SER was able to submit a post to.
Verified – Saves all targets that GSA SER was able to verify a link from.
Failed – Saves all targets that failed submission.

If you are using SER to scrape its own targets then ticking identified, submitted and verified is required for SER to automatically build out a verified list for you.

The way I run my SER instances, I personally feel it is best practice not to use any of these hard-coded functions. If you are using a premium target list then you won't be using search engines to find targets, as they will slow you down. There is also a fair chance you will be mapping the folder path of the premium list to the identified folder for SER to pull targets from for your live projects, so you won't want to be writing to this folder and its relatively clean list of targets.

If you are using two premium lists, as I currently do, then there is a fair chance you will be mapping the second list path to the submitted folder to pull targets from. If you are using a secondary list mapped to the submitted folder, then you don't want to tick the option to save submitted targets to the relatively clean list you have mapped here. Also, if you are filtering a list the way I recommend, then there is no need to save submitted targets as you will be skipping that step of the building process.

There are two trains of thought on the automatically save verified targets tick box. Personally, I leave it unticked because of my list filtering method and manually save the targets I want from that process to the verified folder. However, some people do like to enable this option and just let GSA SER automatically save all the verified targets to the verified folder. The thing is, this option removes all the control the user has in keeping their list clean.

I have never personally seen any reason to auto-save failed targets so I have never ticked this box, but if you are filtering a list as I recommend then this folder can become useful for holding targets.

With my filtering method, you will be manually choosing the targets to keep from the links produced within filter projects. Amongst other things, you can manually separate the do follow and no follow links from those projects. I like to maintain control over what my GSA Search Engine Ranker instances are producing, and the only way to control whether the links are do follow or no follow is to keep their verified targets separate after a pre-filtered project. To do this I manually save all do follow targets to the verified folder and all no follow targets to the failed folder. This enables me to control at the project level whether do follow or no follow links will be built.

50 – Upper Centre – Mapped Folder Paths

This displays the paths the user has mapped for their various folders. The user can click the little drop down arrow to the right of the path to remap it to a different location.

51 – Upper Right – Open Mapped Folders

Clicking any of these buttons will automatically open the mapped folders giving the user the ability to edit them as they require.

52 – Middle Left – Save PR With URL Tick Box

Ticking this will have SER save the PR of a link against its URL. I leave it unticked as PR has not been updated for years and, in theory, doing this for every URL can slow SER down a little.

53 – Middle – Target .txt Format

This lets you change the format GSA SER will look for your target .txt files in any of your folders. Some list sellers will provide their list in the [name].txt format while some others will provide it in the [type]-[name].txt format.

They are not interchangeable and unfortunately there is currently no way to run both at the same time. Thankfully, both this and this list are provided in the same format so I can use both at the same time. I have seen a few threads on the GSA forum and completed TeamViewer sessions with people who were having problems with SER, and when it came down to it they had one option ticked while their list was provided in the other format, so SER was not picking up targets from their folders. A simple change of radio button and the fault was fixed and their SER instances took off.

54 – Middle – Control Options

As I have said multiple times now PR is not updated so I leave the PR options unticked. Sven recently added the ability to use the Yandex TIC score to mimic PR. I know some users have this enabled but I currently see no reason to with the way I use SER.

Also, Google’s PageRank metric and the Yandex TIC metric will have different algorithms, so there is no guarantee that the things Yandex sees as strong link factors are the same things Google does. In addition, in theory, enabling this could slow SER down when it is building a high number of links per day.

The “Enable Important Messages for projects” tick box is the option responsible for those little red triangles that appear next to your projects now and then. This is another option that comes down to user preference. I know many users who just turn it off, but I leave it turned on as some of my projects are purposefully provided with a limited number of target domains to post to, so I want to see the warning that the project has exhausted its target sites.

I have no idea what the minimize to tray tick box does but I’m guessing it minimizes something to tray.

I leave the “detect internet connection problems and stop/restart projects” option ticked out of habit from before I used a decent VPS but in reality there is no reason to have it enabled with the services I use these days.

55 – Third Party API Key Access

The user is able to add their API keys here for access to various services. Out of these four, the only one I have tried is SEREngines. Version one of SEREngines was a total waste of time in my opinion, but although still in beta at the time of writing, version two does show promise, so here’s hoping.

56 – The Tools Button

Clicking this button presents the tools options as shown in the screenshot below followed by a little breakdown of what the options do.

Add URLs from Projects – Pulls either the submitted or verified URLs from projects and adds them to their respective folders.

Add URLs from Backups – Pulls either the submitted or verified URLs from project backups and adds them to their respective folders.

Import URLs (Identify platform and sort in) – Gives the user the option to import URLs from either a file or their clipboard. SER will then attempt to identify the platform of each link and sort it into a folder. This process can lock SER up for a long time depending on the number of links being imported and slow the tool down massively as it can be a resource hog. If you want to build your own lists then I highly recommend you invest in GSA Platform Identifier as it has this functionality and so much more in one tool. Best of all, it runs as a stand-alone tool, meaning no resources are taken from GSA SER.

Import URLs (holding site lists) – I have no idea what this does and, as far as I can remember, I have never used the feature.

Search Online For URLs – Clicking this opens up the search settings box as shown in the screenshot below.

I have never actually used this feature as I use Scrapebox to scrape for any URLs I need, but I would imagine it is similar to the default scraping GSA Search Engine Ranker does within projects while offering the user a little more control.

Search Online For Site Lists – At the time of writing this seems to bring up the same options as the option above; I am not sure if this is a bug.

Parse verified URLs (others linking on same URL) – I have no idea what this does and to my recollection, I have never used the feature.

Import Site Lists – Allows the user to quickly and easily import a site list back up directly into either the identified, submitted, verified or failed folders.

Export Site Lists – Allows the user to quickly and easily export a site list directly from their identified, submitted, verified or failed folders. This is actually a useful feature to back up your site lists and is part of my GSA SER scheduled maintenance.

Manage Site Lists – Allows the user to move, merge or move and merge the site lists stored within their folders.

Footprint Studio – Clicking this opens up the GSA Search Engine Ranker footprint studio for the user as shown in the image below.

When you first start using SER this is not really an important feature as there are more important things to learn, but as you progress, things like custom footprints and custom engines become more important and the footprint studio is involved in both. In addition, the footprint studio is also useful if you choose to build your own lists as it offers the ability to select a platform and get all of the default footprints SER has stored for that engine.

Remove Duplicate URLs – Selecting this presents the user with an options box where they can select the folder or folders they wish to remove duplicate URLs from. Once the folders have been selected, the user is presented with an engine selection where they can choose the engine types to remove duplicate URLs for. Off the top of my head I can’t think of any reason why you would not want to remove duplicate URLs for all engines, and this is also a part of my GSA Search Engine Ranker maintenance.

Remove Duplicate Domains – Similar to the remove duplicate URLs option, this presents the user with a folder selection box followed by an engine selection box. It is useful to note that engines such as blog comments and image comments are unticked by default, as these engines post to individual pages on domains and thus a single domain can hold multiple valid targets for the engine. Again, this is part of my GSA Search Engine Ranker maintenance.

Remove duplicate URLs+Domains (based on engine) – This feature was actually suggested by me a while back as I was sick of having to use each of the above options one at a time during maintenance. Essentially it runs the remove duplicate URLs option with all folders and all engines selected and then instantly runs the remove duplicate domains option for you, skipping engines such as blog and image comments, but you only have to click once.
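To make the difference between the two dedupe modes concrete, here is a small Python illustration. This is my own sketch of the idea, not SER’s actual code: one pass keeps one copy of each exact URL, the other keeps only the first URL seen per domain.

```python
from urllib.parse import urlparse

# Keep one copy of each exact URL, preserving order.
def dedupe_urls(urls):
    seen, out = set(), []
    for url in urls:
        if url not in seen:
            seen.add(url)
            out.append(url)
    return out

# Keep only the first URL seen for each domain. This is the mode you
# would NOT want for engines like blog comments, where one domain can
# hold many valid target pages.
def dedupe_domains(urls):
    seen, out = set(), []
    for url in urls:
        domain = urlparse(url).netloc
        if domain not in seen:
            seen.add(domain)
            out.append(url)
    return out
```

Running both over the same list shows why comment-style engines are excluded from domain deduping by default: collapsing two pages on the same blog down to one entry would throw away a perfectly valid target.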

Remove From List – This gives the user the option to remove an entry or entries from either a single folder or multiple folders. The things to be removed can be imported either from a file or from the user’s clipboard. Once the user has selected their import option they are presented with the folders they wish to run the function on.

The importance of this feature totally depends on what you are doing with your SER instance as it can be a totally useless feature or save your SER instance hours of time.

For example, say you are using SER to build out your own list. You can use the footprint targets or the analyze competitor backlinks option in the project options to get targets; both of these methods are guaranteed to return many false positives for platforms SER will never be able to post to, such as web 2.0 platforms like WordPress.

You can then save a bunch of common web 2.0 domains to a file, select this option and then select the web 2.0 file, and SER will remove all of those web 2.0 targets. I do something similar when using GSA Platform Identifier. Depending on how you are gathering targets and the values you have saved in your purge files, this single process can reduce the number of URLs to be processed by over 60%. That is 60% less time required to process that scrape, all because you completed a simple, quick and easy purging task. That being said, if you are just purchasing premium GSA SER lists then this option is pretty useless for you.
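The purging idea itself is simple enough to sketch in a few lines of Python. This is only an illustration of the concept, and the domains listed are just examples of the kind of entries a purge file might contain.

```python
from urllib.parse import urlparse

# Example entries a web 2.0 purge file might hold (assumed values).
PURGE_DOMAINS = {"wordpress.com", "blogspot.com", "tumblr.com"}

# Drop any scraped target whose host is a purge domain or a
# subdomain of one, keeping everything else in its original order.
def purge_targets(urls, purge_domains=PURGE_DOMAINS):
    kept = []
    for url in urls:
        host = urlparse(url).netloc.lower()
        if any(host == d or host.endswith("." + d) for d in purge_domains):
            continue
        kept.append(url)
    return kept
```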

Clean-Up (Check and Remove none working) – Similar to the remove duplicates options above, this should form part of everyone’s SER maintenance. When clicked, the user is presented with the screen below.

The user is able to select the engines they want to run the purge on; by default I just leave everything selected. The user is then presented with an option for the folders they wish to run the tool on. Depending on what I am doing at the time, I will run it on either all folders or just verified and failed if that is where I have stored my do follow and no follow targets after completing my list filtering process. Once that is done the user is presented with the engine-specific .txt files they wish to run; I usually leave it as all and then press OK.

To my knowledge, the tool then runs off and alive-checks each domain. If the domain is dead it is removed; if the domain responds, the tool then checks for a valid footprint to work out if the target is still running a content management system SER is able to post to. If this is not detected then the target is removed; if it is detected then it is kept. You are then presented with a summary of the clean-up as shown below.

I only started filtering the list last night so it is still small, but you can see the clean-up process has removed 47 targets for me.

Now you have to understand that this feature does have some drawbacks depending on how you run your lists, because domains that are on SER lists go offline a lot due to the spike in bandwidth required and their hosting company freezing the account until the domain owner upgrades their subscription. This feature does not take that into account, and if the domain is offline at the time of the check then it is removed.

A workaround is to back your site list up using the export function above before running the clean-up. Then, before every run of the clean-up tool, restore that backup into an unused folder, merge the backup folder with the one you are about to process using the merge option above, remove duplicates from that folder using the option above, then re-backup the folder replacing the original backup and run your clean-up. Personally, I just run this list filtering system and it works perfectly for me.

Remove Duplicates From File – As you may have guessed, this option removes duplicates from a file selected by the user. Although Scrapebox has the functionality to remove duplicates natively, I am not sure if other scrapers do as I have not used them, so this could be handy for deduping a file from another scraper or from a free link extraction tool such as Xenu Link Sleuth.

Join/Merge many files to one – As you may have guessed, this joins many files into one.

Filter entries from file – Although I have never used this option, I think it is similar to the remove from list option above but works on a saved file rather than your folders.

Disavow Tool – I have never used this, and have no idea what it does.

Show Stats – Allows the user to choose one of their folders, or two folders to compare against each other, and shows the link count for each in a pop-up window.

Moving On To The Data Tab

Here we have the new project window that opens when you click the new project button (entry number 4 on this list). Depending on the engines you have selected in the engine pane yours may look a little different, also by default the two SEREngines options near the bottom as well as the SEREngines Beta option at the top will not be present unless you have a SEREngines subscription.

57 – Left – The Platforms Pane

This displays the various platform types available to the user, the platforms available can change depending on additional services such as SEREngines and custom engines. You can click on the plus sign to the left of any of the platforms for it to expand and display the various engines available for that platform as shown in the below Screenshot.

As you can see the background colour of the different engines changes depending on some engine specific variables that are explained at the bottom of the platforms pane. Right clicking on the platforms pane will open up a quick select menu for it as shown in the screenshot below, most of the options are self-explanatory but I will provide a brief explanation on ones with noteworthy attributes.

Uncheck Engines with No-Follow Link – Unchecks engines SER thinks produce no follow links, but it is highly inaccurate as domain owners customize their domains, meaning they can make links do or no follow as they please.

Uncheck Engines that use no Contextual Links – Unchecks engines that do not have the ability to submit a full article via the script. When the user selects this option they are presented with the option to filter by “Articles Only”, “Profiles Only” or “Allow All”. There are a number of platforms in SER that in theory are able to post both an article and/or a profile to that content management system.

In reality this is rare, as the webmaster of the domain will usually lock down the ability to post articles, so you will just end up with profiles for that content management system. However, if the target is self-hosted and you import the login details into SER then you can use the tool to automatically post articles to that platform for you. Further to the platforms that can post both articles and profiles, you can set the uncheck engines that use no contextual links option to “Allow All” and then disable profiles in the options tab so that you will only be left with articles.

Uncheck Engines Using Email Verification – This is an excellent option for two reasons if you are building out a tier with platforms such as blog comments, image comments, guestbooks and trackbacks, as it deselects all engines that require email verification.

Firstly, if you are not using a catch-all email provider then the cost of emails for services like Yahoo can quickly spiral out of control. Selecting this option removes the need for emails for the project while still leaving you with a nice, high potential link yield, saving you money on emails.

Secondly, it removes the requirement for checking emails for the full project and speeds the process up massively. Personally, if I am using this type of tier all I want is as many links built as possible as quickly as possible, so having this option selected removes the time other engines waste on checking and verifying emails, meaning SER is free to keep making more and more links.

58 – Upper Centre – Data Tab

Opens the data tab, by default this is the tab SER will open.

59 – Upper Centre – Article Manager Tab

Opens the article manager window that I will cover later in the tutorial.

60 – Upper Centre – Options Tab

Opens the project options window that I will cover later in the tutorial.

61 – Upper Centre – Email Verification Tab

Opens the email verification tab that I will cover later in the tutorial.

62 –  Upper Centre – Notes Tab
Opens the notes tab that I will cover later in the tutorial.

63 – Upper Centre – External APIs Tab

Opens the external APIs tab that I will cover later in the tutorial.

64 – Upper Centre – Project URLs Field

This is where the project URLs that GSA Search Engine Ranker will build links for will be visible to the user. To the right of the field, there is a drop down box that can be clicked to view all the URLs currently added to this project.

65 – Upper Right – Project URLs Editor

Clicking on this opens the project URLs editor as shown in the screenshot below. The options are pretty self-explanatory but one thing I would add is that if you are wanting to change URLs in the projects I usually click clear first to completely clear the project URLs field and then click edit again and import from either file or clipboard. I am not sure if there is something in the code for this but I can remember way back when I first started using GSA SER it was considered best practice to do it like this and it just stuck with me out of habit.

66 – Upper Centre – Randomise URLs Tick Box

Ticking this tick box forces SER to post to the URLs in the project URL field at random rather than just working its way down the list. In theory, if you have a large number of links in the project URL field and are running on a slow machine, there is a chance of leaving a footprint, so I tick it out of habit.

67 – Upper Middle – Tiered Link Building Options

Ticking the “Use verified URLs of another project {Multi-Tier/Linkwheel}” option presents the user with the below screenshot. All of the user’s other projects will be displayed and they will be able to tick the projects they wish to attach the open project to, making the open project build its links to the verified links of the selected projects.

Once a project has been selected for tiered link building, the “Edit Tier Filter Options” box will unlock and, when clicked, will present the user with the options in the screenshot below, followed by some explanations for the settings. I have my own system for tiering URLs in SER but I will go over the options offered by default.

Use anchor text from the verified URL – Matches the tier two link’s anchor text to the tier one link’s, so if the tier one link’s anchor text was “Test Anchor” then the tier two link’s anchor would also be “Test Anchor”.

Do Follow Only – Ticking this option will prevent the project from building links to any of the no follow links in the tier one project. For example say the tier one project has 100 links with 25 being do follow and 75 being no follow, ticking this option would make the project only build links for the 25 do follow links.

Indexable only – I have no idea how SER tries to work out if a link is indexable so I have never used this option.

PR must be at least – Forces the tier one projects URLs to meet a user-defined PR before building links to it.

Max amount of external links – Ticking this option checks the tier one verified URL’s outbound link count, giving the user the option to not build links to it if the count is over what they deem too high.

In my experience, different content management systems have varying natural outbound link counts. That, coupled with webmaster site customization such as having a recent posts widget in the sidebar, means a perfectly good tier one link page could have an outbound link count anywhere from zero to one hundred or even higher.

On top of this, there are a number of uses for links built by GSA SER; two examples are to pass link juice and to dilute anchor text. If your project is being used to dilute anchor text then you have no need to worry about outbound link count, as you don’t care about link juice waste; all you care about is getting that link indexed in Google.

Max amount of internal links – Similar to the external link option above, this checks for internal links on the page. Again, perfectly good tier one link pages could range from one to one hundred internal links, and again it depends on what you are using the project for.

URLs Age In Days – I have my own method for this functionality but, in my opinion, this is an excellent setting. At the time of writing, the two main problems faced by links built with GSA Search Engine Ranker are getting them into the Google index and link loss over time.

As I have previously mentioned, when a domain ends up on a premium site list for SER there is a high chance the webmaster will take countermeasures such as putting a harder captcha on the domain, changing the content management system, deleting the post, rolling back the domain or taking the domain offline. If the webmaster rolls back the domain, takes it offline or deletes the post, then any link building to that URL has been a waste of system resources. This option lets your tier two campaign wait, for example a week, before building any links to a tier one link, giving the webmaster plenty of opportunity to remove the link. Thus, if the link is still alive, there is a decent chance it will stick long enough to be useful and pass juice from your tier two.
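The filtering logic behind this setting is easy to picture in code. This is just my own sketch of the idea, with an assumed (url, verified_at) input format and an example seven-day grace period:

```python
from datetime import datetime, timedelta

# Only hand a tier one link to the tier two project once it has
# survived a grace period, on the theory that a webmaster who is
# going to delete the link will usually do so quickly.
def ready_for_tier_two(links, min_age_days=7, now=None):
    """links is an iterable of (url, verified_at_datetime) pairs."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=min_age_days)
    return [url for url, verified_at in links if verified_at <= cutoff]
```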

Min URLs that the main project must have – Essentially gives the user the ability to define a minimum verified URL count that the tier one project must have before tier two links are built to it. Off the top of my head, I can’t think of a reason this would be useful.

Use only URLs from the following engine types – Gives the user the ability to tell the tier two project to only build links to the selected engine types such as articles, social networks and wikis.

Verified URL must have anchor – Allows the user to define the anchor that the tier one verified link must have in order to receive links from the tier two campaign. Although I would not use this personally, the theory is that you would put your exact match or LSI terms in here to strengthen the tier one links with those anchor texts even more.

68 – Upper Middle – URL Randomisation

These give the user the ability to add variation to the link, personally I don’t use these.

69 – Middle – The keyword Pane

The user can put keywords in here for SER to use to scrape search engines, another use for the keyword pane is the GSA Search Engine Ranker macro %keyword%. You can use this macro in your anchor text breakdown or your articles rather than typing the keyword out yourself and SER will automatically change it to your keyword.

Then if for whatever reason you want to change your keyword to a keyword variation then all you would have to do is type the keyword variation into the keyword pane and SER will automatically add it to future links build rather than you having to go through everything and change it manually.

70 – Middle – Scraping Options

Allows the user some control over how GSA SER will scrape the search engines. I personally use Scrapebox to scrape so I have never used these options.

71 – Middle – Anchor Text Pane

Allows the user to import anchor text keywords here. I am not sure, but I think that if your anchor text breakdown (which I will explain later in the tutorial) does not add up to 100% then SER will fill the gaps with keywords from here. To my knowledge, I have never used this as I prefer to have my anchor text breakdown add up to 100% so I have a little more control over what is produced.

72 – Lower Middle – Anchor Text Breakdown

This allows the user a few options to manipulate the anchor text ratio of the links being created by GSA SER for the project. Although the fields are labelled up as things such as “Primary Match”, “Secondary Anchor” and “Branding Anchor” you can use them how you see fit.

One thing to note is that the anchor text breakdown here will not be exactly what you get. For example, say you build 100 links and have your anchors set to be 50% “Anchor Text 1” and 50% “Anchor Text 2”. SER runs the campaign, but some of the engines you have enabled do not support anchor text; they force the link to have the URL as its anchor. So right away you have 40% “Anchor Text 1”, 40% “Anchor Text 2” and 20% “URL” with a total of 100 verified links for the project.

Then over time link loss kicks in and all of your “Anchor Text 2” links are lost, meaning you are left with 60 total verified links: 40 of them saying “Anchor Text 1” and 20 of them saying “URL”. At this stage you now have an anchor text ratio of 66.66% “Anchor Text 1” and 33.33% “URL”. Now, this next part is up for debate. I personally feel that if a link is not in the Google index then it offers no value to ranking in Google and is not counted at all, whereas other people feel that provided the link has been crawled then it matters.

That being said let’s presume that out of your 60 links only 20 of them manage to get indexed and remain within the Google index and by chance all 20 of them are the ones with an anchor text of “URL”. In my opinion, this leaves you with 20 usable URLs from the project all with an anchor text of “URL”.

Now I know this is unrealistic, but I chose this example as I have seen people on various forums complaining that when they check their anchor text ratios using tools such as Ahrefs, the ratios are way off. I hope this little explanation helps you understand how, even though you told GSA Search Engine Ranker to build 50% of your links with “Anchor Text 1” and 50% with “Anchor Text 2”, you can in theory end up with 100% of the verified links that matter to Google having the anchor text “URL”.
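For anyone who wants the arithmetic from that example laid out, here is a tiny Python reproduction of it. The numbers are exactly the ones from the scenario above, nothing more.

```python
from collections import Counter

# 100 links requested at a 50/50 split; engines force 20 of them to a
# bare-URL anchor, giving 40/40/20. Then link loss removes every
# "Anchor Text 2" link, leaving 60 verified links.
links = Counter({"Anchor Text 1": 40, "Anchor Text 2": 40, "URL": 20})
links["Anchor Text 2"] = 0            # all of these links are lost
remaining = sum(links.values())       # 60 verified links survive
ratios = {anchor: round(100 * count / remaining, 2)
          for anchor, count in links.items() if count}
print(ratios)  # {'Anchor Text 1': 66.67, 'URL': 33.33}
```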

73 – Bottom Centre – Test Button

In all honesty, I don’t know if this button is new, but I can’t remember ever seeing it before I was marking the screen grab up in Skitch with numbers. That being said, it seems pressing it runs a soft test for the project and presents the user with any errors found.

74 – Bottom Centre – Tools

Pressing this button will present the user with the project tools options in the screenshot below followed by an explanation of what each option does.

Auto Fill – Presents the user with a number of options that they can enable to auto-fill the fields within the project that require content. There are various services available to import content from, but I have always used Kontent Machine and will continue to do so unless the content it provides becomes a problem to index when compared to other content sources, or GSA release their own content generation tool.

Spin Content – Due to using Kontent Machine I have never tested this button but I presume it will run all text you have entered into the project through the built-in spinner.

Footprint Studio – Opens the footprint studio we have already been over.
Import – Allows the user to import templates into the project and account data for privately hosted blogs.

Export – Allows the user to export the options above as well as make backups of the project.

Show URLs – Shows the submitted and verified count for the project.

Import Target URLs – Allows the user to import targets for the tool to build links on from a file, their clipboard or their site lists in the folders.

Scrolling Our Way Down The Data Tab

Here we have a screenshot of the data tab scrolled down. Remember that not all of these fields will be shown in your projects; they depend on the properties and engines selected for the project, but for this tutorial I plan to go over as much as possible.

75 – Upper Centre – Login Information

Although I usually just let the randomly generated content from Kontent Machine fill these fields the Login option can be useful. Some of the engines in SER will name the account the same as the login information and then put this account name in the URL of the page.

For example, say you want to rank for the term “Widgets”. You could enter the term “widgets” into your login field to try to increase the chances of having your keyword in the URL of any pages built with SER. On the off chance other users are trying to rank for the same term and using the same tactic to get their keyword in the page URLs, there are two options you can take.

Option one is GSA’s random number macro %random-<min>-<max>%. To use this you would enter the following into the login field: “Widgets%random-<0>-<9999>%”. When SER is building your links it will take the term widgets and add a random number between 0 and 9999 to the end of it. The second method is adding spin syntax to the end of the term widgets, such as “Widgets{1|2|3|4|5}”, and SER will use the term widgets followed by a number from 1 to 5.

Although unlikely, it is possible in very competitive niches that other users are doing what you are doing and have hit the site 10,000 times, meaning all variations of the term “Widgets%random-<0>-<9999>%” have been used. In that case you could double or treble stack the macro so it would look like this: “Widgets%random-<0>-<9999>%%random-<0>-<9999>%”, giving potential outcomes of widgets followed by a random number between 0 and 9999 followed by a second random number between 0 and 9999.
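To show how the two tricks expand in practice, here is a rough Python imitation of the macro and spin syntax. To be clear, this is not SER’s actual implementation, just my illustration of the behaviour described above:

```python
import random
import re

# Expand %random-<min>-<max>% macros and {a|b|c} spin syntax in a
# template string, the way the login-field tricks above rely on.
def expand(template: str, rng=random) -> str:
    def rand_macro(match):
        low, high = int(match.group(1)), int(match.group(2))
        return str(rng.randint(low, high))

    def spin(match):
        return rng.choice(match.group(1).split("|"))

    out = re.sub(r"%random-<(\d+)>-<(\d+)>%", rand_macro, template)
    return re.sub(r"\{([^{}]*)\}", spin, out)
```

Calling `expand("Widgets%random-<0>-<9999>%")` yields something like “Widgets4821”, and `expand("Widgets{1|2|3|4|5}")` yields one of Widgets1 through Widgets5.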

76 – Upper Centre – Password Field

This tab lets the user define a set password for the project. It may be useful if, for example, you have the SEREngines plugin building accounts for the various web 2.0 properties it supports but you want to manually post to the accounts. You tell your virtual assistant that all the passwords will be whatever you define here, then export the account data and send it to the virtual assistant, who goes through and manually posts for you. In reality, though, I see no real use for this.

77 – Upper Middle – Twitter URL

This option is not available on the engines I use so I don’t use it but I guess you have to put a Twitter URL here.

78 – Middle – Facebook URL

Same as above.

79 – Middle – About Yourself

The content placed in this field is used by SER to fill out the about me sections on websites. The content shown in the screenshot is the default content added by the project and, in all honesty, I’m not sure, but I think when content is imported from Kontent Machine this is overridden by the content it produces.

80 – Lower Middle – Category Section

I am pretty sure that this section is not available with the engines I use on my projects, and the tooltip for it even says it is not used in many engines.

81 – Bottom Centre – Blog Comment

The content you put into this field is what GSA SER will post onto blog comment targets for you. There are two ways to use this. The first is to let Kontent Machine fill it with auto-generated content when imported, like I do. This is fine when using blog comments the way I do: on throwaway domains to get the search engine spiders to crawl my tiers.

Although I have never actually tried this myself, the second is to put a niche-relevant comment in here that is lightly spun. When using this second method you will want to set up as many filters as possible, such as a bad words filter, a PR filter and an outbound link filter. The goal is to let SER scrape for niche-relevant blogs with good metrics that will be human moderated. If a blog matching the filters is found, the hope is that SER will submit a blog comment that ends up being approved by the blog admin and sits on a page with good metrics, passing link juice directly to your money site. This is a tier one strategy that in theory is quite safe but, as I said, I have not gotten round to trying it yet.

Scrolling Even Further Down The Data Tab

We are still on the data tab but we have just scrolled down a little further.

82 – Upper Centre – Image Comment

This is just like the blog comment above: this is the content that will be posted as an image comment. In theory the same principle can be applied here as with the blog comments to try to find nice tier one targets, but I just let Kontent Machine auto-fill it and use the platform on the third tier.

83 – Middle – Guestbook

Just like the blog comment and image comment, this is the content that will be posted on a guestbook for the user. Again the same theory applies to finding tier one links, but again I only use it with auto-spun content in my third tier.

84 – Bottom Centre – Website Title

This acts similarly to the login information explained above: for some engines the user is able to manipulate what ends up in the URL, letting them get their keyword into it.

85 – Bottom Centre – Description

I would guess this is similar to the about me section explained above but I just use auto-spun content here too.

Even Further Down The GSA SER Project Data Tab

86 – Upper Centre – Profile Image

Allows the user to set a profile image for various platforms, I just leave it as default.

87 – Upper Centre – Forum Comment

Just like the blog, image and guestbook fields but for forums. I have never tried automated forum marketing or this field so I am not one hundred percent sure, but I believe the content put in here will be posted onto actual forum threads rather than forum profiles, provided the option is selected in the project options tab.

88 – Middle – Reciprocal URL

I'm not sure if this option is usually available for the engines I use, in all honesty; either way I would just leave it as default.

89 – Middle – Guestbook Comment (German)

German version of the guestbook comment from above.

90 – Bottom Centre – Micro Message

This field holds the short, status-update style content that micro-blogging engines use. As covered in the backlink control section further down, having the micro message link type enabled is what produces things like the no follow BuddyPress links, so I just let Kontent Machine auto-fill this field too.

Even Further Down The Data Tab

91 – Upper Center – Forum Settings

Various forum settings the user can customize.

92 – Middle – Blog Settings

Various blog settings the user can customize. I would imagine the blog subdomain works like the login information I explained earlier in the tutorial.

93 – Bottom Centre – Video Settings

Various settings for video engines.

94 – Middle – Address Details
Address details the user can enter when posting to the crunch press platform but I never use it personally.

95 – Lower Centre – Video File
I have never used this option but I presume it is for when the user wishes to upload a video to the video engines.

96 – Bottom Centre – Signature
This is used for forum signatures when posting on various forum engines. I never use it personally, and I would imagine forum posts made with SER and auto-spun content would be removed quickly.

That is the end of the data tab, as the rest of the options are duplicates of the above in either German or Polish.

Finally On To The Article Manager Tab

97 – Upper Centre – Article Pane

This is where the articles will appear that SER will post for you. The user is able to manually add articles from the controls in point number 98 or import them from tools such as Kontent Machine as previously explained.

98 – Upper Right – Article Controls

These allow the user to add articles manually, edit articles, delete articles, export articles or preview what the posted article will look like.

99 – Middle – Link Location Options

This option decides where the link will appear in the article once posted. I use the option "just at a random location" to try and minimize a footprint and keep things as random as possible, but the other options shown in the screenshot are also available.

100 – Middle Right – Anchor Text Prior To URL

I have never used this option, but from the tooltip it seems like it will add your anchor text as plain text and then put your link as a naked URL just after it.

101 – Middle – Insert Additional Links

To me the wording of this is a little confusing and I had to reach out for clarification, but what this option does is auto-inject additional URLs from the project's URLs into the article for you.

102 – Middle – Insert Verified Links

As above, this is also a field I had to reach out for clarification on. Essentially, it tries to build a link wheel for you using already verified URLs from the project. The problem is that, when it comes to link loss, a fair few of these URLs will die, leaving link juice pouring out of your tiers, so I do not use this option.

103 – Middle – Insert Random Authority links

This option enables SER to automatically include "authority" URLs in articles. From what I have seen it will just be a large community site with a niche-relevant tag attached to the end of it, but in theory it could help reduce a footprint depending on what you are doing. You can tick the box below to enable search engine searches to try to find links to be used, or tick the box to use your own URLs and manually find third party authority URLs to inject into articles, with this being the option I go for.

104 – Middle – Insert Image

Enables the auto-injection of images into the articles GSA Search Engine Ranker will post for you. Be aware though, similar to how I explained with anchor text, some engines do not allow images to be added to their pages, so you will not always get the number you request. As above, you can also use your own custom URLs if you like.

105 – Middle – Insert Video

As above but with videos rather than images. It seems even fewer engines allow the auto-inclusion of videos in articles being posted to them, and just as above you can use your own URLs if you wish.

As I have said, many engines will not allow the auto-inclusion of videos on their pages, so if you want to run a campaign to embed a YouTube video you are promoting, it is probably better to use a service such as SEREngines or RankerX as their platforms are much more receptive to video injection.

106 – Lower Centre – Shorten Article

I have no idea why this option exists. In my opinion it is less relevant now than ever, as you want as much content as possible posted to your article sites to try and help with indexing.

107 – Lower Centre – Insert HTML Variations

I have no idea what this does and can't remember ever using it.

108 – Bottom Centre – Do Not Submit The Same Article More Than x Times

This setting totally depends on your strategy. If you are using SER to post to a privately hosted blog network, or the SEREngines plugin with aged accounts to automatically post hand-written articles for you, then this can be a useful setting as it will prevent the same article being submitted more than once and avoid a possible duplicate content issue.

If, like me, you use auto-generated content, then the article will be full of spintax, meaning it is safe to submit the same article file more than once as the spintax held within it will change it each time SER unspins it. A while back I did a test with some auto-spun content, and even with a light spin the content was over 95% unique per 3000 uses, so I never limit the number of times an article can be submitted.
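To make the spintax idea concrete, here is a minimal sketch of how a spinner resolves `{option1|option2}` syntax into one variation per submission. This is an illustration only, not SER's actual code; the `unspin` helper is a hypothetical name.

```python
import random
import re

def unspin(text):
    """Resolve spintax like {Hello|Hi|Hey} into one random variation.

    Innermost {a|b|c} groups are resolved first, so nested spintax works too.
    """
    pattern = re.compile(r"\{([^{}]*)\}")  # matches an innermost {a|b|c} group
    while True:
        match = pattern.search(text)
        if match is None:
            return text  # no spintax left, the article is fully "unspun"
        choice = random.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]

# Each call produces a potentially different variation of the same source file.
print(unspin("{Hello|Hi|Hey} there, this is a {great|decent|fine} article."))
```

Because every resolution draws fresh random choices, the same article file yields many unique outputs, which is why a lightly spun file can safely be submitted repeatedly.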

On To The GSA SER Project Options Tab

109 – Upper Centre – Project Posting Options

Essentially this is a number of different ways for you to set up the posting options for the project, ranging from submission or verification limits to the time frames to make them in. The way I set these up totally depends on what I need the project to do for me.

For example, if it is just a filter project I would not tick any of these, as I just want the project to post over and over again, trying to get as many verified URLs as possible for me to push to my live projects.

However, my live projects almost always have a daily submission limit set.

Something to note: once you have filtered your list it is a good idea to set up a few test projects with a submissions-per-day limiter and then see how many verified URLs the project gets from the list. This is because a submissions limiter is more accurate than a verified URL limiter.

For example, say you set up a limiter of 10 URLs per day per target. With a submissions limiter the project would make 10 submissions for the day and pause until the next day, then post another 10. Easy. With a verified URL limiter the project would start posting; it may post to, for example's sake, 50 targets before it goes into verification mode to see how many verified URLs it has built. The project then discovers that out of those 50 URLs, 30 are verified, so it pauses waiting for the day to reset. While it is paused another 10 URLs are verified, meaning where you only wanted 10 URLs per day you now have 40.

Once you have this option enabled there is a per-URL option to the far right. If you do not have this ticked then the limits you put on your posting options are for the project as a whole and all URLs you have it building links for. If you tick it then the limits apply to each URL you have the project building links for.

To develop our example from earlier a little more, say you have it set up to make 10 submissions per day and do not tick the per-URL box. You have 5 URLs in the project for it to build links to, so you end up with 10 submissions for the full project. For example's sake, let's say all 10 submissions become verified; you now have 10 verified links, meaning 2 verified links created per URL in the project.

Now, say you have the exact same project but you tick the per-URL box. The project has 5 URLs it is building links for, so it goes out and makes 50 submissions, 10 per URL. Again, for example's sake, let's say that every single one of them becomes verified, meaning you have 50 verified links, 10 verified links created per URL in the project.
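The arithmetic above can be sketched in a few lines. This is just an illustration of the limiter logic as described, not SER internals; `daily_submissions` is a hypothetical helper name.

```python
def daily_submissions(limit_per_day, project_urls, per_url):
    """Total submissions a project makes per day under a submissions limiter.

    With the per-URL box unticked the limit caps the whole project;
    with it ticked, the limit applies to each URL the project builds links for.
    """
    return limit_per_day * project_urls if per_url else limit_per_day

# 10 submissions per day, 5 URLs in the project:
print(daily_submissions(10, 5, per_url=False))  # project-wide cap: 10
print(daily_submissions(10, 5, per_url=True))   # cap applied per URL: 50
```

So the same "10 per day" setting produces 10 or 50 submissions depending on one checkbox, which is why it is worth double-checking before launching a project.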

110 – Upper Centre – Captcha Services

This is the way you want this project to handle captchas. The options range from "ask the user", meaning a bunch more pop-ups for you to deal with, to "use all services", meaning SER will filter through the captcha services you have enabled in your captcha options from top to bottom. As explained in the captcha options section earlier in the tutorial, your settings here will depend on what you want the project to do for you.

111 – Upper Centre – If A Form Can’t Be Filled

This setting tells SER what to do if it encounters a problem with a form during submission. In the screenshot it is set to ask user, meaning more pop-ups for you to deal with. I have seen people say they have it set to the "Choose Random" option on the off chance it creates a verified URL, but on my own live projects I have it set to the "Skip" option.

My logic is that if there is a problem with a form then there is a chance the webmaster has done something to his website. If I choose random then I guess there is a chance it could create a verified URL, but SER may be held up for a few seconds trying to work out if that random option worked. A few seconds may not sound like much, but when you have 1800 active threads building links those few seconds can quickly add up. When I use SER I am usually going for speed and volume, so I am happy for it to just skip the form and move on to the next target.

112 – Upper Centre – Skip Hard To Solve Captchas
Ticking this box will get SER to terminate its submission and move on to the next target if it detects a hard to solve captcha such as ReCaptcha. As with many options in SER, how I set this depends on what the project is doing. If it is a project for contextual links then I will not enable it, as I want to get as many contextuals as possible. If the project is for non-contextuals then I will tick it, as there are plenty of non-contextual targets out there and I don't want the thread held up for 10 or so seconds while the ReCaptcha is pushed through my captcha services for a simple non-contextual link.

113 – Upper Centre – Enable Custom Mode

I have never used this option but it is on my list for an upcoming project. From what I understand it presents parts of the website to you so that you can manually type in an answer yourself. In theory it could be good for a tier one of blog comments on high metric pages, but as I have said I am yet to try it out.

114 – Middle – Verified Links Settings

I tick the "verified links must have exact URL" option if the project is for contextuals, as I want to know what contextuals have been created and verified before I move them along in the link building chain.

I usually have the "when to verify" option set to automatically on my contextual projects so I can keep better track of what's happening. If I needed a large number of contextual links built within a short space of time, I would change the drop down to custom time and set it to verify every 6, 12 or 24 hours, so that time is not wasted verifying the URLs every so often and the project is free to focus on submissions.

If I am using the project to build blog, image or guestbook links then I usually set the drop down box to never. When building out my tier three the only goal is to get links built on platforms that should already be indexed, so the search engine crawlers climb my tiers to my money site. I don't care if the link is actually verified or not; all I care about is submissions, so I see no point in wasting time and system resources verifying the links as it slows everything down.

One thing to note is that some blog, image and guestbook engines automatically verify, so even though you have told the project not to check, you will get some verified URLs automatically.

115 – Middle – Re-verification options

This setting tells SER how often, if ever, to re-verify your links. If the project is for non-contextual links then I leave it off, as I don't verify them to start with. If the project is for tier one contextual links then I re-verify once per day, so that my tier two project finds out quickly if a link is dead and doesn't waste time and resources building out links to a tier one link that has been lost.

If the project is a tier two contextual project then I sometimes increase the re-verification time frame, depending on what else is going on with that VPS, as it is only non-contextuals being built to them.

116 – Middle – Indexer Options

This option gives the user the choice of whether they want to send their links to an indexing service or not. Due to link loss I have my own method of sending links to an indexing service, because the only way to automatically send a link to an indexing service is as soon as the project confirms verification.

As I have mentioned a few times, link loss is becoming a pretty big issue with SER-created contextuals. On top of that, with the increase in cost of indexing services to counter the latest Google indexing update, the cost of automatically sending verified links to my indexers only to lose them to link loss became too great. I made this suggestion thread on the GSA forum; unfortunately it never gained any traction, but if you agree this is an issue feel free to go post your support.

117 – Middle – Character Spinning

I didn't understand the tooltip for this setting when I first started using SER, so I never turned it on and have never given the option a second thought.

118 – Middle – Randomly Add Typos

Essentially this randomly adds typing errors to the content on posting to make it look more human. Personally I use auto-generated and auto-spun content, so I am fully aware that if an admin inspects the post it will be deleted. On top of that, my theory is that it has to take time and system resources for SER to add the typos, so I leave it turned off.

119 – Middle – Try to always place URL with an anchor text in description/comments

Never turned it on and never looked back.

120 – Middle – Retry to submit to previously failed submitted sites x times

Although this setting has been around for years, I made this suggestion to improve it. Now the user is able to set the number of times that SER is allowed to retry the post before dumping it.

If your project is a filter project then it is a good idea to have this turned up to 25-50, as you want the project to produce as many workable URLs as possible, so this lets the project try the domain that many times before dumping it.

If your project is for building out contextual articles to a live website you are trying to rank, I would probably still have it set to around 25-50 as you want as many contextuals as possible. If the project is to build non-contextuals then have it set as low as 0-5, as you are going for speed rather than increasing referring domains with article-based content.

121 – Middle – Try To Locate New URL

No idea what it does and have never turned it on.

122 – Middle – TAG Options

Some content management systems let the user submit tags with their post. This lets the user tell SER whether to use keywords or anchor text from the Data Tab as tag submissions.

Further Down The Project Options Tab

123 – Upper Centre – The Search Engine Selection Pane

This lets the user select the number and type of search engines they want to use for the project. The usage of this feature depends on how you are building your verified URL list. If you are using SER to scrape for targets then you will have to enable some search engines in the pane. You can right click on the pane to select search engines by country, along with a few other options.

When doing so, be aware that Google has reduced the number of hits it takes to soft ban a proxy. Essentially this means that if you hit a Google domain with the same proxy within the time frame they have set on their back end, the proxy is soft banned and won't be able to connect to a Google domain until the time is up. If you are running multiple threads over the same proxy, each thread counts as a hit.

This is why most people scrape with the free public proxies that SER can collect in its proxy centre, which I discussed earlier: they are free and you don't care if they get banned or not. Additionally, scraping for targets with SER can slow the tool down as it takes resources away from other tasks.

When I scrape for targets I use Scrapebox with free public proxies collected from GSA Proxy Scraper and totally hammer them with as many threads as possible, to get as much out of them as I can before other people do and the proxies are soft banned. Or, I will load Scrapebox up with my semi-dedicated proxies and work out a thread count for the number of proxies I have available, to scrape as efficiently as possible.
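Since each thread counts as a hit against the proxy, the key idea when using proxies you care about is spacing hits per proxy. Here is a minimal sketch of that idea; `ProxyThrottle` is a hypothetical helper, and the interval is a pure guess since Google does not publish its actual soft-ban threshold.

```python
import time

class ProxyThrottle:
    """Enforce a minimum gap between queries sent through the same proxy.

    min_interval is an assumed value; the real soft-ban window is unknown
    and may change, so treat this as an illustration of the technique only.
    """
    def __init__(self, min_interval=10.0):
        self.min_interval = min_interval
        self.last_hit = {}  # proxy -> monotonic timestamp of its last query

    def wait(self, proxy):
        elapsed = time.monotonic() - self.last_hit.get(proxy, 0.0)
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)  # hold the thread back
        self.last_hit[proxy] = time.monotonic()

throttle = ProxyThrottle(min_interval=0.1)
for query in ["keyword one", "keyword two"]:
    throttle.wait("203.0.113.5:8080")  # example proxy address
    # ...send the scrape request for this query through the proxy here...
```

With disposable public proxies you skip this entirely and hammer away, which is exactly the trade-off described above.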

At the bottom of the search engine selection pane you can see it says 4/811 checked, meaning 4 search engines are selected to search with. If you are using a premium list and a list filtering process like I do, then right click the search engine selection pane and check none, as you will be pulling targets from your site list folders.

124 – Middle – Target Acquisition Options

This lets the user have some additional control over how the project will pull its target links to post on.

Always use keywords to find target sites – This option forces SER to use a keyword when scraping a search engine; on my live projects I have it disabled.

Add stop words to query – Apparently ticking this gives you better search results but on my live projects I have it disabled.

Use URLs from Global Site Lists – This is my go-to source for targets, as explained earlier I usually have my do follow targets saved to the verified folder so I tick verified and let SER rip.

Use URLs linking on same verified URL – No idea what this does and I have it disabled on my live projects.

Analyse and post to competitors backlinks – Although I have never used this option in SER, to my understanding it is SER's form of link extraction, and I have done loads of link extraction in Scrapebox. It is a very efficient process for growing a non-contextual verified link base, but be warned, it is very resource heavy.

125 – Bottom Centre – Scheduled posting

This gives the user a number of options to schedule the posting of SER as well as allow it to post multiple times per account created and per domain. As usual how you set it up depends on what you want your project to do.

For example, on a filtering project I would not enable this, as I don't want SER wasting time posting to the same domains or accounts over and over; the point of the project is to verify as many domains as possible as quickly as possible.

On tier one and tier two contextual projects I do enable this, but your settings will depend on your risk tolerance. Allowing SER to post a high number of times on the same account means you lose all of those links if an admin checks one of the posts and realizes it's auto-generated content, whereas a lower number of posts per account but more accounts per site spreads the load more.

On a blog, image and guestbook project I crank it right up. To my knowledge most of those engines don't require a sign up anyway, and for the ones that do I don't care if the account gets banned; the Google spider crawls these sites often enough for the links that get posted to offer value before they go.

The Very Bottom Of The Project Options Tab

126 – Upper Centre – Skipping Options

These offer the user various options to skip blogs based on things such as their outbound links, the PR of the link and the PR of the domain. I have never used this in my SER link building to date, as I have always used the tool to go for volume rather than quality, although I do plan to do some testing with stuff like this in the future.

127 – Middle – Backlink Control Pane

In my opinion, this is one of the most overlooked and underrated parts of the tool. When setting up a campaign you select the link types you want the tool to make, and to my knowledge it is the only tool that offers this level of control.

For example, earlier in the tutorial I mentioned how if you are getting a high number of no follow BuddyPress links it is probably because you have the micro message option enabled here. I also mentioned how you can select engines that can post to both articles and profiles more easily.

If you want those engines to post only to articles then make sure you only have the "Article" option ticked; if you want them to post only to profiles then make sure you have the "Profile-Contextual" option ticked; and if you want both, have both ticked.

A few other things about the backlink control pane: by default the "Article-Wiki" option is not enabled, meaning any wiki posts you get will just be profiles with a low word count. I highly recommend you tick the "Article-Wiki" option on any project you want contextual wiki articles from, or else it's not going to happen.

Secondly, the "Forum Post" option will try to make an actual forum post on the forum targets. If you are using auto-spun and auto-generated content then don't expect these to hang around for long, as the forum moderators will delete them, resulting in massive link loss and wasted time.

128 – Middle Right – Additional Filtering Options

Try to skip creating Nofollow links – Ticking this is a waste of time; it does not work reliably. Essentially, when engines are created there is a variable that is set as to whether the engine builds do follow or no follow links. This option checks that variable and then avoids the engines that say no follow.

That being said, webmasters can modify their content management systems pretty easily to change the link output type. Take an engine like Question2Answer, where the vast majority of links are no follow. All it takes is for a webmaster to change his link output type to do follow and his domain gives you do follow links.

On the flip side, take something like Drupal Blogs, where most links are do follow by default. Having this option ticked will let SER post to Drupal Blogs, but if the webmaster changes his link output type then his domain gives no follow links.

This is why I highly recommend filtering your lists: you process all the links available but separate the do follow links and the no follow links into different folders by their actual link output, rather than what SER thinks it should make.

129 – Middle – Bad Words Filter

The clue is in the title: tick this and if one of these words appears on the page, SER will skip it.

130 – Middle – Bad Words Filter Domain/URL

Similar to the option above, but this one checks the URL for the words rather than the on-page content.
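Conceptually the two filters are simple substring checks against different inputs. A minimal sketch, with `should_skip` as a hypothetical helper name (SER's real matching rules are not documented here):

```python
def should_skip(url, page_text, bad_words):
    """Mimic the two bad-word filters described above: one checks the
    on-page content, the other checks the domain/URL itself."""
    url_lower = url.lower()
    text_lower = page_text.lower()
    skip_by_url = any(word in url_lower for word in bad_words)
    skip_by_content = any(word in text_lower for word in bad_words)
    return skip_by_url or skip_by_content

bad_words = ["casino", "viagra"]  # example entries for the filter list
print(should_skip("http://example.com/blog",
                  "Welcome to my casino reviews", bad_words))  # True
print(should_skip("http://example.com/blog",
                  "A post about gardening", bad_words))        # False
```

The point of having both variants is that a clean-looking page can still sit on a bad-word domain, and vice versa, so you can filter on either signal independently.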

131 – Lower Centre – Skip Sites By Country

I'm not sure exactly how this works, but I presume it checks the server IP that the domain is hosted on against a register of countries. Either way, it gives the user the ability to skip sites based on their IP.

132 – Lower Right – Skip Site By Language

I have no idea how SER works out the language of a site, but this option gives the user the ability to filter targets based on the site's language.

Moving On To The Email Verification Tab

133 – Middle – Email Pane

This is where you can view the emails or catch-alls that you have added for the project.

134 – Centre Right – Email Control Options

Various options for the user to add, delete, export, test and blacklist check their emails. If you are using email accounts, be sure to test them when you paste them into your projects; you will just be wasting your time if you let a project run for a day before realizing all the emails you put into the project were dead to begin with.

Due to the ever increasing price of email accounts and how quickly SER can burn them up, I highly recommend you invest in a catch-all service.

They are very cheap, with the basic package being around $5 per month, and that basic package meets my needs perfectly. As I mentioned earlier, when you are doing heavy blasting with blog, image and guestbook platforms most of them don't require email sign ups anyway, so the only thing really using your catch-all is your contextual targets.

With traditional emails you can only make one account per site, but with a catch-all you can make infinite accounts per target for $5 per month; it's a no-brainer.

135 – Middle – Email Editing Options

No idea what this is in all honesty; I usually just paste my catch-alls in and let them go. I would guess it is a way to manually add or edit emails.

136 – Bottom Centre – Email Options

These are pretty self-explanatory but as I mentioned earlier in the guide in the proxy options section, if you are using accounts like Yahoo then it’s a good idea to enable proxies for email checking and increase the time between email account logins to try and increase the life span of your emails.

Onto The Notes Tab

137 – Middle – Notes Pane

Taking notes and logging changes is essential in Search Engine Optimisation. You could use the notes pane to log your notes if you like but I personally save them directly to my rank tracking service as I can overlay them onto a graph of project keyword rankings to look for correlation.

The External API Tab

In all honesty, the only service I have used here is FCS Networker, a few years back before the owner sold it and the new owner let it go downhill, so I changed my preferred web 2.0 link builder. But if you use any of these services, this is where you add the details for them.

Onto The Project Edit Options

After all that, I hope you now feel confident enough to set a GSA SER project up, having a better idea of what things do. Once you have set a project up you will have a few new options when you right-click on the project in the project pane, as shown below.

Set Status – Provides the same options as the edit button's drop down arrow covered earlier in the tutorial.

Set Priority – Gives you the option to set a project's system resources priority.

Show URLs – Lets you view the submitted and verified URLs for a project. The show remaining target URLs option is helpful for seeing how many target URLs are still loaded into the project for it to post to. Show failed re-verifications is a way for you to track link loss and filter out any common domains from future link building. Show stats about remaining target URLs will provide a breakdown of the platform types remaining.

Select – If you have a bunch of projects, this offers a number of ways to select them. I won't go into it here as I plan to do a full post on the importance of naming conventions in the future, but select by mask is a massive help when you have a bunch of projects going at once.

Sort By – Allows the user to sort the projects how they see fit.

Show Diagram – This option provides the diagram in the screenshot below, displaying various stats about the links for the selected project or projects.

Import Target URLs – Provides the user with a number of ways they are able to import target URLs to the project or control the targets loaded into the project already.

Filter Log By Selected Projects – This is a helpful fault finding tool. If you have a bunch of projects running at once, select a project and click this, and only that project's data will be displayed in the log, so you can see what the project is trying to do a little better.

Modify Project – This presents a bunch of ways the user can edit the project. A few months back I was using dedicated servers to knock out around 1.5 million links per day. After getting tired of doing daily servicing on the projects with the existing options, I suggested the reset data option be added and thankfully Sven added it. If mass link building like that is something you are considering, then the reset data option is an excellent tool to reset a blaster project so it is ready to go again.

Refresh – I have no idea what this option does and don't think I have ever used it.

Thanks For Reading

That concludes the ultimate guide to GSA Search Engine Ranker! If you actually managed to read the full post then I would imagine you have both the thirst for knowledge as well as the patience to get somewhere in this game!

This post was last modified on March 19, 2019 1:07 am