Nagu Rangan with Jason Barnard at SMX London 2019
Nagu Rangan talks with Jason Barnard about how Bing ranking works.
First question, very obvious: “What are the ranking factors?” He tells me that ranking factors are just a means to an end. Then Nagu tells me that relevancy is Bing’s North Star, and that there is no point in distinguishing between ranking factors and ranking signals. I ask what the ranking factors are (again) and Nagu politely explains that he gets leery when answering on the details. He tells me that obsessing over individual ranking factors is a futile exercise, since the factors and their weightings are constantly changing. He guides me to the three considerations a search engine uses to optimise user satisfaction: Relevancy, Quality and Context. We move on to what the optimal keyword density is, and how Bing now gets a much deeper understanding of queries with Deep Learning and Neural Networks (Bing has been using neural nets since around 2006). I manage to bully Nagu into agreeing that everything we do as SEOs should serve Understanding, Deliverability and Credibility. I tell Nagu my theory about how the topic layer might work, and he tells me my theory is a great product idea. Then a chat about the weather, which sounds boring but isn’t. Does Bing have the same bidding system as Google for rich elements? No clear answer there, but Nagu hints that it is a similar system. Bing is getting jiggy with real-time information, so freshness and recency are really important to Bing and are a driver of rich elements too. Theresa May gets a mention – because when she was 5, “Bing wasn’t a Thing”.
Transcript for How Bing Ranking Works (Nagu Rangan with Jason Barnard)
Jason Barnard: Welcome to the show.
Nagu Rangan: Thank you. Great to be here.
Jason Barnard: Now, I just met you walking out of the conference, SMX London. You’re a guy from Microsoft, and you are product manager — or program manager — of ranking, which is incredibly exciting. At Bing. And you were talking about ranking and relevancy, and we started a discussion. What I’m interested in: what are the ranking factors? There’s a big smile on your face there.
Nagu Rangan: Yeah.
Jason Barnard: You’ve heard that question a few times, I take it.
Nagu Rangan: Yes.
Jason Barnard: And how far can you answer it?
Nagu Rangan: I can answer the question. Ranking factors are not the end. They’re the means to the end.
Jason Barnard: Brilliant. I like that.
Nagu Rangan: We talked about this offline before getting onto the podcast. The way we think about the goal of building a search engine — our northstar is user satisfaction.
Jason Barnard: “Northstar” — is that official Bing terminology?
Nagu Rangan: Yeah. So our northstar is user satisfaction. The way to understand it is to provide timely, relevant results that users can trust. That’s what we’re trying to do. And ranking factors — there are several hundreds of them, and they help us determine the results we should be showing to users.
Jason Barnard: Is there any point in differentiating ranking signals and ranking factors? I’m starting to think ranking factors are these big clumps of ranking signals.
Nagu Rangan: I kind of use them interchangeably. It’s a set of signals, or a combination of signals, used for determining ranking.
Jason Barnard: So the distinction isn’t really—
Nagu Rangan: In my mind they’re interchangeable.
Jason Barnard: And you’re the king of ranking algorithms, so your mind is—
Nagu Rangan: Quite the king — but there is an entire team. I’m just representing the work of Bing at SMX.
Jason Barnard: Brilliant. So, ranking signals — give me some examples.
Nagu Rangan: Making sure your site is indexable — that’s a ranking factor. Making sure there’s clear content. Making sure the site’s marked up, all of that.
Jason Barnard: That wasn’t a very fair question. You were talking about relevancy. I’m really interested in relevancy — I missed your talk this morning and I’m very sorry.
Nagu Rangan: No worries.
Jason Barnard: Go ahead.
Nagu Rangan: Let’s talk about the aspects of user satisfaction, which is our northstar goal. There are essentially three factors. One is relevance — is the result on topic? Say I want to learn about, or visit, the Tower of London, and I search for “Tower of London.” I want to see a result about it.
Jason Barnard: Not Tower of Pisa—
Nagu Rangan: Or Tower Hamlets, London.
Jason Barnard: Or the Taj Mahal. Right.
Nagu Rangan: So that’s essentially what relevance is. The second one is quality — can the user trust this result? One of the signals is inbound links, but there are several.
Jason Barnard: Inbound links — you’ve got your equivalent of PageRank.
Nagu Rangan: Yeah. As I mentioned, there are several hundreds of signals that work together in this context.
Jason Barnard: Yes. I’m not going to get much more out of you, am I? Go ahead — the third one.
Nagu Rangan: It’s more around how to think about it. Ranking signals are a means to an end, and I always get a little bit leery when—
Jason Barnard: That’s a great word.
Nagu Rangan: —when SEOs and digital marketers start focusing on… let me give you an example: what’s the right keyword density?
Jason Barnard: Yeah, yeah, yeah.
Nagu Rangan: Sure, keywords are important. If you’re talking about the Tower of London, it’s important to have the words “Tower of London.” But we’re making quite a bit of progress there. For example, we just open-sourced one of our deep learning libraries, which helps you do something pretty sophisticated. If the query is “the height of the famous tower in Paris” — note I never said Eiffel—
Jason Barnard: Mm-hm.
Nagu Rangan: —we’re still able to return the result about the Eiffel Tower, because we’re able to get a much deeper understanding of what the query is saying. I’m deviating a bit, but the point is: how do we really understand what the user is trying to do?
Jason Barnard: And relevancy requires us to understand what they’re trying to do — the knowledge graph, understanding the intention of the person, what they’re looking for, even if they don’t say the word “Eiffel.” “Eiffel” is going to be on the page anyway.
Nagu Rangan: “Eiffel” is implied in what the user is trying to do, and with deep learning we’re able to understand that in a way we weren’t able to before. Before the age of deep learning. Just for context, Bing has been using neural nets since around 2006 — not deep neural networks, those came a bit later.
Jason Barnard: The distinction is important; I just don’t understand it. So — keywords. We talked about this scenario where a particular keyword isn’t even in the query.
Nagu Rangan: Even then we’re able to understand what the user is implying and surface the right result. I get asked often: what’s the optimal keyword density? People consider it a key ranking factor and obsess about it. And that’s what I’m trying to address — obsessing over one single ranking factor is not really the way to think about this, because ranking factors change all the time.
Jason Barnard: Brilliant.
Nagu Rangan: New ones are added, and the level of importance each one has—
Jason Barnard: —is changing constantly.
Nagu Rangan: Constantly. And it changes depending on the query. So it’s — maybe I shouldn’t call it this, but — it’s a bit of a futile exercise to over-obsess on keyword density, or any one single ranking factor.
Jason Barnard: That brings me back — you saw my talk, and I was saying whatever you do has to serve understanding, credibility, or deliverability. Everything fits into that trio. And you’re kind of saying the same thing. It was Brent C. Payne who said it’s better to be a straight-C student than get three A’s and an F — if you concentrate on one or two things and completely ignore another, you’re going to get tanked.
Nagu Rangan: Yeah. User satisfaction has essentially three components that influence it.
Jason Barnard: We’ve done the first—
Nagu Rangan: Quality, and context. Context is essentially: I live in Seattle, I’m visiting London, and this morning when I woke up at the hotel I searched for “coffee.” It doesn’t say, “You’re from Seattle, so here are coffee shops near where you live.” It knows I’m in London and surfaces results near the hotel.
Jason Barnard: Nice, simple example of context.
Nagu Rangan: In this case I’m talking specifically about location.
Jason Barnard: Can I ask about another dimension of context I’m really interested in? Google was talking about the topic layer on the knowledge graph. I have this theory: I’m a musician, I can play the double bass. If I’m clicking on links to more advanced double bass topics, would it not be logical to stop showing me the simple ones? Can the engine understand where I am in my journey? If I start learning piano, I search “lessons for piano,” I go to the first one — over time, should the results not adapt to my level of expertise?
Nagu Rangan: Great product idea.
Jason Barnard: I’ll give it to you. Go ahead.
Nagu Rangan: There are queries where people are researching a topic — home renovation, for example — where it spans not just one query over an hour, but several days. There are interesting things we could do to address those queries.
Jason Barnard: What I’m suggesting is phenomenally complicated and isn’t going to happen tomorrow.
Nagu Rangan: Well, one of the amazing things is: since we’re anchored on user satisfaction, we’re constantly improving our understanding of the pain points. We’ve come a long way. Say someone wanted the weather forecast twenty years ago — you had to wait for a news programme that aired only at certain times of day. Today you go to Bing, search for “weather,” and we give you the forecast plus information like the pollen level in the air. We’ve come a long way on that one query, and the point is we’re not done — we’re continuing to add new features and innovating on behalf of the user.
Jason Barnard: And the whole thing is moving faster and faster.
Nagu Rangan: Every single day on ranking, we’re working on new innovations.
Jason Barnard: Another question related to my talk: the rich elements. You’ve got them too — answer boxes, images, and so on. I was arguing that rich elements outbid the standard blue links. Is that how Bing works — inserting these elements on top of the ten blue links?
Nagu Rangan: The search results page has evolved and continues to evolve. If 5-7 years ago you wanted information about a current event that happened an hour ago, search engines weren’t good at handling that. Now we’re able to recognise: for this query, something is happening right now, so we should show that result alongside everything else.
Jason Barnard: So freshness and recency are a really good way to get your information into Bing.
Nagu Rangan: Yeah. If you’re searching for Theresa May, people are probably interested in what she’s saying now, not where she was born or who she is.
Jason Barnard: Much more likely they’re interested in what she’s saying today than what she was saying when she was five.
Nagu Rangan: So when you search for her, there’s a whole bunch of elements: news articles about what she said, trending items, pictures of her, a quick snippet about who she is, probably an entity panel. The SERP is trying to address the user’s needs, composed of all these elements.
Jason Barnard: The Theresa May example is interesting because it’s very news-focused, very freshness-focused. The argument I was making about rich elements outbidding the blue links — for her it’s obvious because of the constant stream of news. For me it’s less so, because I don’t have that rate of news associated with my name. Is that correct?
Nagu Rangan: Yeah. In a few days there’ll be the cricket World Cup.
Jason Barnard: You like cricket?
Nagu Rangan: I love cricket. It starts on the 31st of May. And I won’t be able to go to any of the games — I’m going back to Seattle.
Jason Barnard: So you’ll have to watch the baseball.
Nagu Rangan: Yes. During that period there’ll be a lot of interest in cricket, a lot of news articles. If someone searches a cricket-related query, they’re most likely to find a news article. After the World Cup concludes, 3-4 months later, the same query — say a team’s name — returns different results, because the user’s intent changes over time. So the SERP changes accordingly.
Jason Barnard: So we have these rich elements coming and going, and everything is constantly changing. That’s another problem with obsessing over ranking factors — next week it’ll be different, because the moment will have changed.
Nagu Rangan: Yes.
Jason Barnard: Brilliant. I love that. Thank you very much, Nagu Rangan.
Nagu Rangan: Thank you.
