Bing Jail

A short story about how my website lost all its indexed pages in Bing, affecting my presence in DuckDuckGo; about secret blacklisting; and about being classified as a splog by Bing AI.

My personal site has recently been penalized by Bing, or, if you prefer a different name for it, secretly blacklisted or shadowbanned. I don't know exactly why, but by the end of January 2023 I had lost every page that had been indexed in Bing.

My website Bing Search Performance over last 6 months

"Why do I even bother?" I asked myself when I noticed that trend in February, so I left it for the time being.

I do not use Bing and I don't like it; however, I use other search engines that rely on Bing, like DuckDuckGo.

Because of that, when you used the search on my website and the results were not as expected, I suggested a link to repeat the search through DuckDuckGo. Until the issue is sorted out, I have been forced to change it back to Google.

Firstly, don't bother contacting Bing Support. That support is simply non-existent. Nobody will reply to your emails, and nobody will care unless you are a heavy-paying customer.

Hey Bing Team: if you want to follow up, my most recent request is REQ00056361.

There are plenty of forum posts with similar experiences, like a recent one in which I decided to participate.

While I don't care too much about Bing Search itself, I am angry about my site's lack of presence on DuckDuckGo.

I started reading various things, and yes, I went through the Bing Webmaster Guidelines. If you know my site a bit, you may see that it is well optimised and that all the requirements have been met, so that wasn't the issue.

I have been thinking that somebody at Microsoft disliked the content of my website and decided to ban me behind the scenes without acknowledging it.

It looks like Microsoft is doing some shady secret blacklisting which, if it happened to a company with a budget to spare, would end in a lawsuit. With a niche website like mine, they can sleep without worry and the support can keep ignoring me.

I have some articles about Microsoft and Windows, but all of them are legitimate and none relates to piracy. So why?

So why do I think that somebody there does not like what I am writing and secretly blacklisted my website? Because none of my other websites has been affected.

All my other websites have been built with the same webmaster and SEO practices and techniques; the only difference is the content. So yes, I think that Bing, and Microsoft itself, is doing some shady work behind the scenes and discriminating against websites through its search engine.

I would be very happy to have an open discussion about the reasons, but when you don't know the real reasons behind it and have nobody to contact or speak with, it just makes you a bit angrier.

I decided to ask the Bing Chat bot "Why is my website not indexed in Bing?" and here is what I got.


“Your site is new and Bing hasn’t discovered and crawled it yet. With new websites, it can take Bing time to find links and crawl through to your website.”

No, that’s not the case.

“Bing is running into issues crawling the site. If Bing consistently cannot crawl your website, it may not select it to be indexed.”

No crawling issues were reported, nada.

“No links are pointing to your website. A common problem for new websites is the lack of links pointing to them from other places on the web. Such links help Bing discover your content and tell us about the popularity of your web pages.”

Google knows how many links there are and how popular the website is. When importing my website into Bing Webmaster Tools, I used the direct import from Google Search Console, which should confirm that there are indeed links and that some of them are quite popular.

“You can check if any of these issues apply to your website and take appropriate actions to resolve them. You can also use Bing Webmaster Tools to get more insights into why your website is not being indexed.”

It would be nice if I could actually see any issues. The text "use Bing Webmaster Tools to get more insights into why your website is not being indexed" just angers me even more. What insights!?

I am not alone; there are plenty of users reporting the same problems, without any support from Microsoft and the Bing team.


You may ask what issues are reported when trying a URL Inspection in Webmaster Tools. The information is very vague.

Bing Webmaster Tools - URL Inspection through Bing Index

And that's through the Bing Index, whereas Live URL doesn't see anything wrong.

Bing Webmaster Tools - URL Inspection through Live URL

I, of course, clicked the Request Indexing button, just as I do whenever I publish new content.

I even removed my website from Webmaster Tools and re-added it, on 21st April, but I will have to wait to see the effect of that (if there is any).

An interesting fact is that my website actually lives on a subdomain (dariusz.) of my main domain (wieckiewicz.org), very much like my daughter's (anna.), whose website was not penalized by Bing.

When I started messing with Webmaster Tools by removing my website and re-adding it, I remembered that a couple of years ago Google tended to suggest adding to Search Console any domains and variations you have control of. So on 22nd April 2023, I added the main domain (wieckiewicz.org) as well. Through it, I added the sitemaps for my website (on the subdomain) and for my daughter's, to see if anything would change. It would be nonsense if it did, but let's try.

When I had added the domain and verified ownership through a CNAME DNS record, I noticed a Search Performance chart on the Bing Webmaster Tools homepage. As this is (in theory) a newly added site, there was no graph to show; however, there were some figures above it that made no sense, and they got me thinking.
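For context, Bing's DNS verification works by adding a CNAME record to the domain. A minimal sketch of what that record looks like in a zone file, with a made-up verification code (the real code is unique per Webmaster Tools account):

```
; hypothetical verification code issued by Bing Webmaster Tools
1234567890abcdef.wieckiewicz.org.  IN  CNAME  verify.bing.com.
```

Once the record propagates, Bing checks it and confirms ownership of the whole domain, subdomains included.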

Bing Search Performance for domain wieckiewicz.org

The heading of the flat graph showed Indexed Pages at a value of 1.8K. That is a bit of nonsense: firstly, there is no website on my main domain, and my daughter's site has only 123 pages. It started to look like these were the figures from my website. When I played with the date-range switch, I noticed that they most likely reflect pages indexed months ago, hence they are not very accurate.

I decided to wait a couple of days to see if the graph would update and whether anything changed.

If something did in fact change in Bing's requirements around January (we know about their core update in mid-January), it would be nice if their Webmaster Guidelines at least mentioned it somehow.

I keep investigating others' experiences with this matter, like Dave's; he shared a couple of posts about the issue.

Dave took some steps to get back on track with Bing, which I had also followed before I even discovered his website.

He solved his issue. I decided to follow his lead and complain publicly and blog about it as well; hence this text.


I learned some new terminology today: "splog". Bing's robots most likely classified my site as one, which means "spam blog". Total nonsense!

Seriously, I would appreciate it if somebody at Bing went through my website and explained to me how it qualifies as a spam blog.

Apart from sending my most recent request (REQ00056361) to the Bing Webmaster Support team, there is nothing I can do about it.


Dave mentioned some crawling issues and SEO errors being reported, so I decided to look at mine as well. I ran a full Site Scan from under the SEO menu to see what it would bring.

What it brought had nothing to do with being classed as a "splog", but I still decided to go through the results.

Analysing Site Scan results

Bing Site Scan Completed

  • Total pages scanned: 2.6K
  • Errors: 11
  • Warnings: 277

Bing Site Scan Completed - Issue details

ERROR: Blocked by robots.txt

I intentionally added one block rule to my robots.txt file to prevent indexing of the links to my download files, so that people get them through the posts on my website and understand what they are for, rather than blindly downloading them from a search engine without knowing the context.

There is nothing to fix here; Bing may think differently, but it's all intentional.
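For reference, a rule like the one described looks something like this in robots.txt (the /download/ path is an assumption for illustration; my actual rule differs):

```
# Intentional: keep download files out of search results so that
# visitors reach them via the posts that explain their context.
User-agent: *
Disallow: /download/
```

A Disallow rule only blocks crawling of those URLs; it is a deliberate trade-off, not an error, so scanners flagging it can safely be ignored.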

ERROR: Meta Description tag missing

Just two pages here with missing meta descriptions. There may be more, but these two were an easy fix, which I made straight away.

WARNING: Title too long

No it is not!

My titles, where they are long, are descriptive enough, and in some languages they need to be; otherwise they would just sound like clickbait.

I don't get why Bing still follows the outdated recommendation of a 70-character title length. This is total nonsense. Google's people have confirmed over the last months that they have no length limit for titles. If a title is too long to display on a given device, they simply truncate it to fit.

So… Bing… truncate the titles and STOP with this nonsense recommendation!

Nothing to fix here.

WARNING: Meta Description too long or too short

239 pages. I get it, this can be fixed. It requires some work but can be done.

Of course, for any warning related to an actual page I will look to fix it (add a meta description), but a recommendation to fix the meta description of a specific /tag/ page is a bit excessive. Seriously, Bing?

Luckily, thanks to templating in Hugo, I was able to add meta descriptions to all tags with minimal work.
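One way to do this in Hugo is a single fallback in the head template. A minimal sketch, assuming a head partial; the partial name and fallback wording are my illustration, not my exact setup:

```
{{/* layouts/partials/head.html -- sketch */}}
{{ $desc := .Description }}
{{ if not $desc }}
  {{/* fallback for taxonomy/term pages such as /tag/... */}}
  {{ $desc = printf "Posts tagged %s on %s" .Title .Site.Title }}
{{ end }}
<meta name="description" content="{{ $desc }}">
```

Because taxonomy pages are generated from one template, a single edit like this covers every tag page at once.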

WARNING: HTML size is too long

“Evaluated size of HTML is estimated to be over 125 KB and risks not being fully cached. Search engines may not fully acquire the content on a page if the page contains a lot of code. Extraneous code can push the content down in the page source making it harder for a search engine crawler to get to it. A soft limit of 125 KB is used for guidance to ensure all content & links are available in the page source to be cached by the crawler. This means that if the page size is too big, a search engine may not be able to get all of the content or may end up not fully caching it.”

Ok, this is something that caught my attention.

As my website is generated with Hugo and I am not using any heavy additions, the pages should be light and fast. The fact that some pages come in above 125 kB required me to look a little deeper.

After a short investigation: the highlighted posts have plenty of images, and because of that the <picture> tags in the code, which contain references to various sizes of each image in WebP format, push the pure HTML file a bit over Bing's soft limit.
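To illustrate why the HTML grows (filenames and widths here are made up), a single responsive image can expand into markup like this, and a post with dozens of images adds up quickly:

```html
<picture>
  <source type="image/webp"
          srcset="/img/photo-480.webp 480w,
                  /img/photo-800.webp 800w,
                  /img/photo-1200.webp 1200w"
          sizes="(max-width: 800px) 100vw, 800px">
  <img src="/img/photo-800.jpg" width="800" height="600"
       alt="Example photo" loading="lazy">
</picture>
```

Each variant URL is plain text in the page source, so the markup cost scales with the number of images times the number of sizes offered.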

None of the other search engines has complained about this, and the code is served through the Netlify CDN, so, weighing the pros and cons, I decided to leave it unchanged.

Bing, you will just have to deal with the fact that this is how it is going to be.

WARNING: Meta robots tag contains restrictive robots directives

Yes, intentionally; forget about it and move along. It is just a warning that should not be considered a problem and definitely should not prevent the site from being indexed.

WARNING: Meta Refresh tag exists

Yes, once again, it's intentional; forget about it and move along.

NOTICE: H1 tag missing

Nothing to fix here. Bing reported issues with files that are just meta-redirects (aliases) to other pages. All other pages on my site follow this most basic SEO principle.
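This is consistent with how Hugo renders aliases: each alias is a tiny HTML stub that redirects to the real page, roughly like the sketch below (simplified, with a placeholder URL). The meta refresh, the restrictive robots directive, and the missing H1 all come from these stubs, which is exactly why the warnings above are harmless:

```html
<!DOCTYPE html>
<html>
  <head>
    <title>https://example.org/real-page/</title>
    <link rel="canonical" href="https://example.org/real-page/">
    <meta name="robots" content="noindex">
    <meta http-equiv="refresh" content="0; url=https://example.org/real-page/">
  </head>
</html>
```

The noindex directive on these stubs is desirable: only the canonical target page should appear in search results.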


Now I will leave it and see if it changes over the next weeks, months, or never. I am not expecting anybody from Bing to contact me, but if that happens and I get any positive outcome, I will update this post.

Nicholas A. Ferrell, the editor of The New Leaf Journal, which is still suffering from Bing Jail, has created a helpful collection of articles about Bing Jail in his Not-So-Awesome Bing Search Bans and De-Indexing GitHub repository. It is worth taking a look.
