At the monthly Content Conversations event by the Asia Content Marketing Association, brand marketers told Campaign Asia-Pacific the most common factors that prevent digital web properties from ranking.
The top five factors cited are below, along with advice from APAC-based SEO experts on how to counter them:
1. Pointing low-quality backlinks at a competitor's site. Doing so signals to Google, Bing, and Baidu that the site has weak domains supporting it, thereby hurting its ranking.
The fix: Marketers must regularly audit their backlinks in Google Search Console (GSC) to spot any unnatural links early.
According to Shaad Hamid, head of SEO for SEA at APD, marketers who notice malicious links pointing to a site should do one of the following:
- ask the website owner to remove the toxic link,
- disavow toxic links or domains by uploading a disavow file through GSC, or
- do nothing, if they haven't seen any negative ranking or traffic fluctuations.
"Google understands that there will be some percentage of ‘bad links’ pointing to any given website – it’s the nature of how the web works," said Hamid. "So if you’re confident that 90% of your links are healthy and from highly authoritative websites, then a few bad links can’t harm you."
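For reference, the disavow file uploaded through GSC is a plain UTF-8 text file with one URL or domain per line; the domains below are placeholders for illustration, not real offenders:

```text
# Lines starting with # are comments and are ignored.
# Disavow every link from an entire domain (placeholder name):
domain:spammy-links.example
# Disavow a single linking page:
https://low-quality-directory.example/listing/page1.html
```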
2. Content scraping or duplication, whereby content is replicated from a competitor's site, confusing the search algorithm as to which site is the authentic source.
The fix: Hamid says that Google is good at assigning credit to the rightful author. A marketer's time and resources, he recommends, are better spent building and growing a community than chasing content scrapers, which is a drain on resources in the long run.
For the short run, he recommends three fixes:
- Include internal links to other web pages on the site, signalling to the search engine's crawler that the content is related. Internal links also help a site owner spot content scrapers, who will usually keep the links within the content while scraping.
Hamid recommends that marketers monitor GSC regularly to quickly spot these links and take corrective action, such as contacting the website owner to ask them to take down the content, or filing a Digital Millennium Copyright Act (DMCA) complaint with their host. Most hosting companies have forms for DMCA complaints.
- Set up an RSS feed and customise the footer to include a disclaimer such as "[article heading] was published on [your website name or URL] and is not allowed to be copied on other sites".
- Implement 'canonical tags', which protect site owners from unintentional duplication caused by their own website.
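The last fix above amounts to a single line of markup; the URL here is a hypothetical example. A canonical tag in a page's head tells search engines which URL is the authoritative version of the content:

```html
<!-- In the <head> of every variant of the page (print view,
     tracking-parameter URLs, etc.), point to the one canonical URL: -->
<link rel="canonical" href="https://www.example.com/original-article/">
```

Scrapers that copy a page wholesale often copy this tag too, which can itself signal the original source to crawlers.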
3. Hiring a content farm to mass-produce negative reviews on Google business listings and review sites. Since most search engines rely on user-generated content as one of many ranking factors, this practice hurts businesses that are on a prospect's radar during the consideration phase of the buyer decision journey.
The fix: Marketers must work towards self-preservation.
According to Athena Bughao, senior media activation director for search and biddable platforms at Essence, monitoring and prevention are key here, so marketers must invest in tools that enable both.
"There are numerous online tools that allow us to keep track of backlinks, content duplication, scraping and negative reviews - both free and paid," said Bughao. "Copyscape, for example, is a great free plagiarism checker that shows you if your content is appearing on other domains - it detects duplicates for you."
She adds that nothing beats authentic customer service and a high-quality experience, which proactively build a positive brand image.
"High brand affinity among audiences, plus favourable reviews and feedback, will be able to nullify negative exposure," said Bughao. "If your customer service and product are truly above par - in working with brands, there have been cases where the clients themselves are advocates of the brand, which makes for the best-case scenario."
4. Competitors bidding on brand terms, whereby brand A bids on a branded keyword that is a registered trademark of brand B, either because brand B has stronger mindshare or because doing so will trick an active seeker into visiting brand A's site.
The fix: Simply email the competing brand and ask them to stop.
That's the advice of Danish Ayub, CEO of MWM Studioz, who adds that brand marketers can always file a trademark complaint with Google or work to improve their quality score while bidding on the same branded keywords.
"A good quality score lowers campaign costs and brings higher rankings," said Ayub. "In the digital era, that's what every marketer wants at scale."
5. Draining PPC campaigns by programming bots to click on ads. This is often done to send fake traffic to sites while compromising the integrity of paid search.
The fix: Identify the source of fraudulent traffic and reassess segmentation & targeting.
Ayub recommends that marketers turn to Google Analytics 360 to understand the channels, sources, mediums, and geographies of the bot clicks, then turn the tables by cutting off those segments.
"Exclude certain countries and languages to see your click fraud drop significantly," said Ayub. "Test and learn, removing one flagged country at a time and tracking your end goals."
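The test-and-learn approach Ayub describes can be sketched in a few lines. This is a minimal illustration, not any platform's API: it assumes a hypothetical CSV export of paid-search data with country, clicks, and conversions columns, and flags geographies with heavy click volume but near-zero conversions as candidates for exclusion.

```python
# Minimal sketch: flag likely click-fraud segments from a hypothetical
# CSV export of paid-search data (columns assumed: country, clicks, conversions).
import csv
from io import StringIO

SAMPLE = """country,clicks,conversions
SG,1200,48
MY,900,31
XX,5000,1
"""

def flag_suspect_countries(rows, min_clicks=1000, max_conv_rate=0.005):
    """Return countries with heavy click volume but near-zero conversions."""
    suspects = []
    for row in rows:
        clicks = int(row["clicks"])
        conv_rate = int(row["conversions"]) / clicks if clicks else 0.0
        if clicks >= min_clicks and conv_rate <= max_conv_rate:
            suspects.append(row["country"])
    return suspects

rows = csv.DictReader(StringIO(SAMPLE))
print(flag_suspect_countries(rows))  # only the "XX" segment is flagged
```

Flagged countries would then be excluded one at a time in the ad platform while tracking end goals, per Ayub's advice.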
Bughao says that platforms like AdWords are now quite sensitive to bot traffic and have been shown to reimburse advertisers for invalid clicks.
"Ensure that your website is secure with authenticators and website log tracking that will help you justify additional costs should you need to prove to online platforms that you are being hacked or subject to black hat SEO," she said.
All respondents agreed that prevention is better than cure and that proactive monitoring and self-preservation were key.
"We lock our doors, install cameras in our homes and passwords on accounts we know are valuable to us, the same should be applied to our online brands," said Bughao.