Sometimes links don’t return the image, title, or description you expect. This article will cover the troubleshooting steps necessary to create social media-friendly content cards.
Error message or unexpected results
Every time you create a Link-type Post or Share, the webpage is scraped to determine what image, title, and description will appear in the content card.
If you don’t see the link preview you expected, or you’re encountering an error similar to the one below, this article will guide you through the most common scenarios and how you can solve them.
The quickest test is to compare the results from Facebook, Twitter, LinkedIn, and EveryoneSocial. Since all four platforms use the same industry-standard technology to scrape metadata from links, testing other social media sites can help determine where the problem lies.
Before you begin, make sure you have the hyperlink in question handy. Next, find a link that works well for sharing on, say, Twitter. Articles from major news outlets work well for this purpose. You’ll reuse this link as the “known-good link” in the comparison tests below.
1️⃣ Paste the known-good link into the Meta Sharing Debugger and select Fetch New Information if prompted. Note the results. Then, repeat this step with the link in question.
2️⃣ Paste the known-good link into the Twitter Card Validator and note the results. Then, repeat this step with the link in question.
3️⃣ Paste the known-good link into the LinkedIn Post Inspector and note the results. Then, repeat this step with the link in question.
4️⃣ Inside EveryoneSocial, select Compose > Post > Link and paste a link from a known-good source. Note the results. Then, repeat this step with the link in question.
Understanding the results
If Facebook, Twitter, LinkedIn, and EveryoneSocial successfully created a content card, then no worries. All is well! But if that’s not the case, keep reading!
If any of those social media platforms fail to create a content card—entirely or partially—the odds are overwhelmingly high that EveryoneSocial produced a similar result, since all four platforms use the same technology to create content cards.
Addressing the problem
For social media platforms to successfully scrape a webpage, a few key requirements must be fulfilled. We’ve documented those requirements below and provided some handy tips to share with the web developer who maintains the webpage(s) in question.
1️⃣ The webpage must apply Open Graph (OG) tags to the image, title, and description.
Facebook: A Guide to Sharing for Webmasters
Twitter: Overview of all Twitter Card Tags
freeCodeCamp: What is Open Graph and how can I use it for my website?
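As a reference, a typical set of OG tags in a page’s `<head>` looks like the sketch below. The URLs and text are placeholders, not values from a real site:

```html
<!-- Placeholder values; replace with your page's real metadata -->
<meta property="og:title" content="Example Article Title" />
<meta property="og:description" content="A one-sentence summary that appears in the content card." />
<meta property="og:image" content="https://www.example.com/images/preview.jpg" />
<meta property="og:url" content="https://www.example.com/articles/example" />
<!-- Twitter falls back to OG tags, but you can set the card type explicitly -->
<meta name="twitter:card" content="summary_large_image" />
```

If these tags are missing or empty, the scraper has nothing to build the content card from, and the platform falls back to guessing (or shows nothing at all).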
2️⃣ The webpage must allow specific crawler bots to view the page contents.
Crawler bots, often called “user-agents,” scrape the website to find the OG tags for the image, title, and description. If these user-agents are blocked (usually in the robots.txt file), you’ll run into trouble on social media platforms.
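To illustrate what a crawler does once it is allowed in, here’s a minimal sketch of extracting OG tags with Python’s standard library. The HTML below is a hardcoded placeholder, not a real page:

```python
from html.parser import HTMLParser

class OGTagParser(HTMLParser):
    """Collects Open Graph <meta property="og:..."> tags, as a crawler would."""
    def __init__(self):
        super().__init__()
        self.og_tags = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        prop = attrs.get("property", "")
        if prop.startswith("og:") and "content" in attrs:
            self.og_tags[prop] = attrs["content"]

# Hardcoded sample page (placeholder values, not a real site)
sample_html = """
<html><head>
<meta property="og:title" content="Example Article Title" />
<meta property="og:description" content="A short summary." />
<meta property="og:image" content="https://www.example.com/preview.jpg" />
</head><body>Article body here.</body></html>
"""

parser = OGTagParser()
parser.feed(sample_html)
print(parser.og_tags["og:title"])  # -> Example Article Title
```

A real crawler fetches the page over HTTP first, identifying itself with its user-agent string; if robots.txt blocks that user-agent, the fetch never happens and no tags can be extracted.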
Ensure that EveryoneSocial’s user-agent is allowed. We recommend adding the other social media platforms’ user-agents to your allow list, too.
EveryoneSocial User-Agent: EveryoneSocialBot
Twitter User-Agent: More information here
LinkedIn User-Agent: LinkedInBot (version numbers and information will change)
Facebook User-Agent: More information here
In addition to our internal crawler bot, EveryoneSocialBot, we also have two failover options in case our crawler is blocked. While it’s not required, we recommend allowing these user-agents access to your webpage. Version numbers will change over time.
Mozilla/5.0 (compatible; Embedly/0.2; +http://support.embed.ly/)
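Putting it together, a robots.txt that explicitly allows these crawlers might look like the sketch below. The user-agent tokens for Twitter (Twitterbot) and Facebook (facebookexternalhit) are the commonly documented ones; confirm them against each platform’s current documentation before relying on this:

```
User-agent: EveryoneSocialBot
Allow: /

User-agent: LinkedInBot
Allow: /

User-agent: Twitterbot
Allow: /

User-agent: facebookexternalhit
Allow: /

User-agent: Embedly
Allow: /
```

Note that `Allow: /` only overrides blanket `Disallow` rules elsewhere in the file; if your robots.txt doesn’t block anything, these entries are harmless but redundant.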
Have more questions for us?
If your URL is populating correctly on Twitter, Facebook, and LinkedIn, but it does not appear in EveryoneSocial, please reach out to our Support team so we can investigate further!