Sometimes links don’t return the image, title, or description you expect. This article will cover the troubleshooting steps necessary to create social media-friendly content cards.
Error message or unexpected results
Every time you create a Link-type Post or Share, the webpage is scraped to determine what image, title, and description will appear in the content card.
If you don’t see the link preview you expected, or you’re encountering an error similar to the one below, this article will guide you through the most common scenarios and how you can solve them.
The quickest test is to compare the results from Facebook, Twitter, LinkedIn, and EveryoneSocial. Because all of these platforms scrape the same industry-standard metadata (such as Open Graph tags) from links, testing the other social networks helps pinpoint where the problem lies.
You'll want to have both a known working link and the link that's giving the error. Articles from major news outlets are a reliable place to find a working link. You’ll use the working link in the test comparisons below.
1️⃣ Paste the working link into the Meta Sharing Debugger and select Fetch New Information if prompted. Note the results. Then, repeat this step with the link in question.
2️⃣ Paste the working link into the Twitter Card Validator and note the results. Then, repeat this step with the link in question.
3️⃣ Paste the working link into the LinkedIn Post Inspector and note the results. Then, repeat this step with the link in question.
4️⃣ Inside EveryoneSocial, select Compose > Post > Link and paste the working link. Note the results. Then, repeat this step with the link in question.
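If you prefer a local check in addition to the validators above, the rough Python sketch below fetches a page while sending a crawler-style User-Agent header and prints whatever Open Graph tags it finds. The URL is a placeholder, and the script is only an approximation; it does not reproduce any platform's actual scraper.

    # Rough sketch: fetch a page the way a crawler might and list its Open Graph tags.
    # The URL below is a placeholder; replace it with the link you are testing.
    from html.parser import HTMLParser
    from urllib.request import Request, urlopen


    class OGTagParser(HTMLParser):
        """Collects <meta property="og:*" content="..."> tags from a page."""

        def __init__(self):
            super().__init__()
            self.og_tags = {}

        def handle_starttag(self, tag, attrs):
            if tag != "meta":
                return
            attrs = dict(attrs)
            prop = attrs.get("property") or ""
            if prop.startswith("og:"):
                self.og_tags[prop] = attrs.get("content", "")


    url = "https://example.com/blog/some-article"  # placeholder
    request = Request(url, headers={"User-Agent": "EveryoneSocialBot"})
    html = urlopen(request, timeout=10).read().decode("utf-8", errors="replace")

    parser = OGTagParser()
    parser.feed(html)

    for name, value in parser.og_tags.items():
        print(f"{name}: {value}")

If the working link prints og:title, og:image, and og:description but the link in question does not, the page itself (rather than the social platforms) is the likely culprit.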
Understanding the results
If Facebook, Twitter, LinkedIn, and EveryoneSocial successfully created a content card, then no worries. All is well! But if that’s not the case, keep reading!
If any of those social media platforms fails to create a content card, either entirely or partially, the odds are high that EveryoneSocial will show a similar result, since all four platforms rely on the same metadata to create content cards.
Addressing the problem
For social media platforms to successfully scrape a webpage, a few key requirements must be fulfilled. We’ve documented those requirements below and provided some handy tips to share with the web developer who maintains the webpage(s) in question.
1️⃣ The webpage must include OG tags that define the image, title, and description. The resources below explain these tags in detail, and an example follows them.
- Facebook: A Guide to Sharing for Webmasters
- Twitter: Overview of all Twitter Card Tags
- freeCodeCamp: What is Open Graph and how can I use it for my website?
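For reference, here is a minimal, hypothetical example of what these tags can look like in a page's <head>. The titles, descriptions, and URLs are placeholders that your developer would replace with the page's real values.

    <head>
      <!-- Open Graph tags read by Facebook, LinkedIn, and EveryoneSocial -->
      <meta property="og:title" content="Example Article Title" />
      <meta property="og:description" content="A one- or two-sentence summary of the page." />
      <meta property="og:image" content="https://www.example.com/images/preview.jpg" />
      <meta property="og:url" content="https://www.example.com/blog/example-article" />

      <!-- Twitter Card tag; Twitter falls back to OG tags for most other fields -->
      <meta name="twitter:card" content="summary_large_image" />
    </head>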
2️⃣ The webpage must allow specific crawler bots to view the page contents.
Crawler bots, identified by their “user-agent” strings, scrape the website to find the OG tags, image, title, and description. If these user-agents are blocked (usually in the robots.txt file), you’ll run into trouble on social media platforms.
Ensure that EveryoneSocial’s user-agent is allowed. We recommend adding the other social media platforms’ user-agents to your allow list, too (an example robots.txt appears after the lists below).
- EveryoneSocial User-Agent: EveryoneSocialBot
- Twitter User-Agent: More information here
- LinkedIn User-Agent: LinkedInBot (version numbers and information will change)
- Facebook User-Agent: More information here
In addition to our internal crawler bot, EveryoneSocialBot, we also have two failover options in case our crawler is blocked. While it’s not required, we recommend allowing these user-agents access to your webpage. Version numbers will change over time.
- Embedly: Mozilla/5.0 (compatible; Embedly/0.2; +http://support.embed.ly/)
- Iframely: Iframely/1.3.1 (+https://iframely.com/docs/about)
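As a rough, hypothetical sketch, the robots.txt snippet below explicitly allows the crawlers discussed in this section. Your site's actual robots.txt will differ, and the Facebook, Twitter, and LinkedIn crawler names shown here should be confirmed against the documentation linked above.

    # Allow EveryoneSocial's crawler and its failover scrapers
    User-agent: EveryoneSocialBot
    Allow: /

    User-agent: Embedly
    Allow: /

    User-agent: Iframely
    Allow: /

    # Commonly documented social media crawlers (verify against each platform's docs)
    User-agent: facebookexternalhit
    Allow: /

    User-agent: Twitterbot
    Allow: /

    User-agent: LinkedInBot
    Allow: /

    # Your existing rules for all other crawlers go below; this Disallow is a placeholder
    User-agent: *
    Disallow: /private/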
Have more questions for us?
If your URL is populating correctly on Twitter, Facebook, and LinkedIn, but it does not appear in EveryoneSocial, please reach out to our Support team so we can investigate further!