Siadhal Magos

Metaview CEO & Co-Founder

Today on Talk Talent To Me, we are joined by the founder of Metaview, Siadhal Magos. Metaview helps its clients become better interviewers by providing transcripts of their interviews and feedback on their interviewing skills. Tuning in, you’ll hear some examples of the feedback Metaview gives, its ‘gold standard’ of interviewing, why the average employee is not very good at conducting interviews, and how Metaview helps clients identify their best interviewers and use them as an example for other employees. We also discuss some unique ways clients have used Metaview’s tools and why the highest-performing employees aren’t always the best interviewers. Then we look into why interviewers should run the same interview repeatedly and how to prevent burnout. Lastly, Siadhal gives us some tips on what leaders can do to improve their interviewing process. Join us to find out more!

Episode Transcript

[INTRODUCTION]

[0:00:06.1] RS: Welcome to Talk Talent to Me, a podcast featuring the most elite talent leaders on the front lines of modern recruitment.

[0:00:12.8] FEMALE: We actually want to understand the themes of someone’s life, we want to understand how they make decisions. Where are they willing to take risks and what it looks like when they fail.

[0:00:22.7] RS: No holds barred, completely off-the-cuff interviews with directors of recruitment, VPs of global talent, CHROs, and everyone in between.

[0:00:31.1] FEMALE: Once I went through the classes and the trainings got the certifications through diversity and inclusion, I still felt like something was missing.

[0:00:39.7] MALE: Talent acquisition, it’s a fantastic career. You are trusted by the organization, you get to work with the C-suite and the security at the front desk and everybody in between, and everybody knows you.

[0:00:53.0] RS: I’m your host, Rob Stevenson and you’re about to hear the best in the biz, Talk Talent to Me.

[INTERVIEW]

[0:00:59.9] RS: Here with me today on Talk Talent to Me is the CEO and co-founder of Metaview, Siadhal Magos. Siadhal, welcome to the podcast, how are you today?

[0:01:08.1] SM: I am very good, thank you Rob, how are you doing?

[0:01:09.8] RS: I’m doing really well. I am about to set sail on the TT2M Roadshow, we’ve got shows in San Francisco this week and then London the following week so I’m cramming three weeks of work into two days here because I don’t think I’m going to get much done on the road because I’m also going on vacation after that.

So just to – just cranking, a little bit hectic over here in the basement beat lab but no, that’s never not hectic I guess.

[0:01:33.1] SM: Sounds like you’re embracing the post-COVID life, or post-ish anyway, I guess.

[0:01:37.6] RS: Yeah, post in terms of I’m vaccinated, boosted, and I can go out into the world without fear, or with only a marginal amount of fear, but yeah, it’s good. I mean, the UK doesn’t require anything of you really. They don’t even require a vaccination status to arrive. The US wants you to have a negative test when you come back, but all is fair in love and war as far as the UK is concerned. But glad to have you. You are yourself in the UK, is that correct?

[0:02:00.9] SM: Yeah, that’s right, based in London most of the time.

[0:02:03.1] RS: Got it. Well, so glad to have you. Siadhal, there’s so much I want to speak with you about because this company you founded, Metaview, has a really awesome offering. Rather than just stumble my way through explaining what your company does and have you correct me, would you mind sharing with the folks at home a little bit about what Metaview does, and maybe we can get some background into your experience and how you wound up founding the company?

[0:02:22.6] SM: Absolutely. So Metaview is the leading platform for essentially driving the quality of interviews at ambitious companies. Our founding thought really is that, when you’re hiring at scale, it’s really hard, frankly impossible, to keep interviews at a super high quality and consistency.

As you grow, there are just so many people involved, so many different interviews, so many different roles. You’ve got so many different locations potentially as well, and it’s frankly just impossible to make sure interviews are consistent and well-calibrated. That basically means you end up missing out on top talent, which is really the name of the game, right?

Hire amazing people to help you hit your objectives. So when you miss great people or you hire the wrong people, that’s really painful and really damaging for business velocity. What we aim to do is fundamentally give our customers unprecedented visibility into what’s actually happening in their interviews.

We do that by recording and transcribing and generating unique insights about what’s happening in those conversations, and then layering on a set of training flows to help interviewers, who are the people, you know, obviously, where the rubber meets the road, right? They’re the people who are actually building your team, so we actually help interviewers run a better interview. So yeah, that’s what we’re about.

[0:03:32.3] RS: So, if I were to conduct an interview using this tool, what would be some examples of the feedback it might give me after I’d finished?

[0:03:38.8] SM: Feedback is a part of what Metaview does. It’s not only about feedback, but it’s a really important part of what we do. There’s also the fact that you have the transcript, which is used a ton by our users, by our interviewers. You know, in a scenario where maybe they’ve had back-to-back meetings all day and the interview was first thing in the morning, they don’t really remember what was talked about in the conversation; all they have is that foggy memory, maybe, about how they felt about the candidate.

Then you of course have the transcripts to go back to, which is super useful. There’s also a second layer above that, which is the objective data you as an interviewer get about the interview, so you’d know what percentage of the interview you were speaking as opposed to the candidate, and how that compares to some of your previous interviews, other interviews conducted in the company, or even other interviews conducted on Metaview’s platform across other companies.

Other things like how many questions you asked and how that compares, how consistent you’re being. Those are some of the objective measures we give interviewers, which is a really great way for interviewers to reflect and think if there’s something they would want to do differently the next time.

Then there is this final layer, what we call the coaching layer, whereby Metaview actually provides proactive suggestions on what to change. So in a scenario, for example, where you maybe ask questions that result in very short answers from the candidate, we’ll suggest ways that you can rephrase those questions so they’re more likely to elicit a descriptive answer from the candidate. That’s the sort of feedback, I guess, that most people have in mind when they think about our platform.

[0:05:00.2] RS: Even just the transcription part, it sounds so obvious when you say it. Why have we not been doing this all along? Because like you say, even just a record of what you spoke about beats the alternative that most people contend with: people putting their notes into the ATS, hopefully right away, but usually not, and it usually starts with, “I had a great conversation with so and so.”

Cool, we can’t use that to evaluate them at all, so having the actual record of what you spoke about is super important. Also, side note, nothing is quite as humbling to someone who thinks they present their ideas well as seeing their speech in a transcript. You don’t realize how many times you start sentences over, how many times you use “like, umm, you know.” For anyone who wants to present themselves better, I think that’s really, really important.

[0:05:45.6] SM: I couldn’t agree more, and you’re making me worried now about this podcast recording and how many times I say “like” and “um,” but we’ll see, I guess. What I would say about your question, “Hey, why isn’t this happening? Why has it been this way for the longest time?” is that the world’s changed, right? It’s only actually in the last couple of years that the world has fundamentally changed in that we’re almost all having all of our business meetings talking into microphones, talking into cameras.

So suddenly, it’s just so much easier to start to capture these conversations, and that’s the fundamental shift. There are definitely technology advances that have enabled Metaview to make sense; speech-to-text is pretty darn good now. You get highly accurate transcripts automatically and very cheaply. But there’s also just the societal shift where, as I said, every conversation is now had within arm’s reach of a microphone and a camera.

That’s going to have very profound impacts on the way organizations run themselves, and this is one of the ways that data is going to be harnessed and used to help organizations hire more effectively, and that’s where we’re leading the charge.

[0:06:44.4] RS: That’s such an important callout because there’s a company called Gong that does something similar, but it’s for sales calls, and the idea is that we can help your salespeople do better by recording their calls. Of course, a lot of sales calls have been happening over Zoom for a long time, right? I’m pleased you pointed that out though, that only recently have we been doing these interviews remotely, so now is the time.

I’m curious how you train the technology, and what I mean by that is, what is the gold standard of interviewing that you sort of base the technology on so it can say, “Okay, here are the areas where you can improve”?

[0:07:22.4] SM: There are two ways that we look at this. One is almost customer agnostic, so every customer sees the same thing, and the other is very customer specific. In the customer-agnostic case, what Metaview is great at is identifying clearly anomalous or bad interviewer behavior. That’s things like interviewers who consistently dominate conversations with candidates and don’t actually let them get a word in edgeways, or who are consistently late for interviews, or who ask very few questions.

Now, the way we actually surface that insight and pass it on to the interviewer is, you know, we’re on the side of the interviewer here. Often these people aren’t aware, and actually just getting the insight helps them change their behavior. It’s not a big rubber-stamp tool; it’s a coaching and nudging part of the product. But essentially, we have a series of benchmarks that we’ve honed over the tens of thousands of interviews we’re capturing every month on the platform to identify, in relatively broad terms, what bad patterns look like and what should therefore change.

So, that’s sort of bucket number one of the gold standard built into Metaview, and it’s very sympathetically passed on to the interviewers themselves, but it’s a relatively broad view of what a solid interview looks like. The second and more important part of this is the customer-specific lens on what a good interview looks like, and in that scenario, the reality is that a lot of these companies have folks in them who understand who their best interviewers are.

Either that’s because they’re the most experienced people in the company and have been so central to building the great team they currently have, and, you know, they want to almost clone that person and wish every hiring manager could be as effective as they are. Once you start to capture their interviews, you start to build up a vision of what a great interview looks like for that type of interview within this company.

Now, what we don’t want is for everyone who adopts Metaview to end up interviewing exactly the same; that’s not a competitive advantage for anyone. Actually, what you want people to do is use this data, and the fundamental artifacts, the actual recordings and transcripts, to coach more of their interviewers to be just like what they consider their best interviewers.

So at that level, it’s very customer specific, and I actually can’t give you a sort of Metaview answer to, “Well, this is what a great interview looks like when you’re hiring engineering managers and you’re trying to run a leadership interview.” There is no generic answer to that. Every organization should apply their own judgment based on the type of people they’re trying to hire, and Metaview enables them to do that by giving them the data but also just giving them the vantage point.

[0:09:41.9] RS: I’m going to say something that you probably wouldn’t say but might quietly agree with, which is that your average employee is probably a pretty poor interviewer. I think it’s a very clunky sort of experience. Even when you train people, they still have to take those things into the room, and people are prideful about it. They’re like, “Oh, I’m just going to have a conversation. I’ll just go in there without any background and it will be fine,” and that approach a good interview does not make.

[0:10:09.3] SM: Yeah, I actually do agree with you, and I don’t think it’s controversial, by the way. We spend a ton of time speaking with interviewers. Again, these are the people we serve, and a key metric for us is whether interviewers tell us that Metaview is helping them and making them feel more confident. We have various feedback loops to get that information back into our product, and yeah, it’s a KPI for us.

Most of them will agree that they’re not actually so sure about what a great interview looks like. Now, what people do think they’re good at is judging. Everyone thinks they’re a good judge of people, but that’s not the same thing as thinking, “I actually know how to elicit the signal I want,” which I think a lot of people accept is a skill to be honed that most have never had great training on.

That’s one of the reasons, actually, why Metaview tends to be so well received: we’re helping you with that side of things, not with the judgment. You’re just going to have more signal, more information with which to make your judgment.

[0:10:59.8] RS: Right, right, and the important thing with people not being particularly good at interviewing is that it’s maybe not necessary yet to layer on some advanced kind of sentiment analysis, right? Like, “Oh, you left this person feeling slightly morose,” you know? That isn’t what’s needed, but there are a couple of other, more obvious screws to tighten here, namely dominating the conversation.

If someone is speaking more than the other person, or if they are not even really asking questions, they’re making statements that they expect someone else to expound upon, that’s a poor interviewer, right? You don’t need a very locked-down process or advanced training to improve even just those basic things, right?

[0:11:42.8] SM: Yeah, agreed, and most of the advanced training that does occur on Metaview’s platform is via shadowing. A lot of great companies, and a handful of the big FAANG companies are known for this, would train their interviewers almost exclusively by saying, “Hey, we’re going to schedule you in to listen two or three times to someone else’s interview.” So over the next four weeks, you’re going to be scheduled into these expert interviews just to get to know how they do it, and then, once you’ve seen how they do it, you’ll be able to do it just as well as them.

A really great way to train people, but really inefficient. Even just having recordings helps with that more expert level of “how specifically do I interview in this context with these types of candidates we’re hiring.” Actually, the best teachers there are your current people, and having recordings, transcripts, and videos is just a great way to pass that on. That’s how you get, again, that expert level of training, whereas the data is more of a safety net for the organization as a whole.

[0:12:40.4] RS: Yeah, even just having it on demand is so important, because interview scheduling is such a nightmare, and if you have to add another person whose only role is to learn from someone else, then it just becomes even harder. So yeah, having this database, like, “Oh, here’s our best interviewer, go check out their interviews.”

You said that a lot of it is just that the talent leadership knows who their good interviewers are and they want to kind of clone that sort of experience. How do you think they’re arriving at that decision? With the people who are using your service, I’m sure you’re having these leading conversations like, “Hey, put your best interviewer on this, and then, you know, tell the rest of your team to go forth and do likewise.” What are they sharing with you about how they identify who the really good folks are?

[0:13:20.5] SM: We actually help our customers with that. As part of a Metaview implementation, we integrate with your existing applicant tracking system. What that means is we know what the applicant tracking system knows. The things the applicant tracking system knows that are really interesting to us are things like which of your interviewers tend to be well calibrated: when they say yes to a candidate,

the final recommendation on the candidate tends to be a yes, and when they say no, the final recommendation tends to be a no. Who are the people who have interviewed a lot for a certain role and have had lots of reps at this over a long period of time? Who are the folks who, when they interview someone, the candidate is more inclined to accept the offer than to reject it?

So, there are all these clues you can pick up that are actually in the data already. Then there is a bunch of data that we bring to the table once you start capturing your interviews, because we know whether interviewers are falling into any of those bad habits or not. That cuts the list down even further to the folks who could hold up as a great interviewer, and then the third input is just the tribal knowledge, you know?

There are any number of ways that people have landed at that observation, but when you mix the sort of tribal knowledge that certain leaders in the company have with the two data sets that I mentioned, Metaview’s data and the ATS data, we usually end up with a pretty good guess, or a pretty good insight, as to who the top interviewers in the company are.

[0:14:35.4] RS: Right now, this is primarily used for the interviewer’s sake and not really used to evaluate the candidate at all, is that right?

[0:14:43.1] SM: Correct. People of course use it; the transcript itself has many uses. People use it for all sorts of things, whether that is self-reflection, or checking what the candidate said if they didn’t hear it, or if there is uncertainty about a decision, or if they’ve got two candidates and one has a better resume than the other but they can’t remember what happened in the interviews, they would actually go to the game tape as opposed to just, you know, falling back on resume bias.

So there are obviously cases where people use it to help with a candidate decision, but the actual functionality above and beyond the transcript is purely focused on helping interviewers do a better job.

[0:15:15.6] RS: Do you see that as a possible future product offering, maybe even in terms of, “Oh, this person was adversarial in their responses”? Can we tell by the language, the words they use, what kind of person they are or what their tone was like, just based on evaluating their responses?

[0:15:31.6] SM: We’re staunch believers in the power of the data within interviews; that will forever be true. On the specific applications, on how and what is the most powerful way to use this data, we’re very customer-driven. Whatever the customers want to use the data for is what we get excited about helping them achieve. We want organizations to be able to populate themselves with the best people.

We think the best way to do that right now is by helping interviewers elicit much more signal and give candidates a much better experience during the interviews themselves, but in time, there will be other things too, and that would be an interesting example. What we often tell people when it comes to the product as a whole, though, is that we’re augmenting the humans in this interview loop.

The type of companies that we work with are very high-scale, high-performing, ambitious companies, often hiring for creative roles. What we are not trying to do is get to a point where we predict who will be a fit for this exact role. We are just trying to augment people’s ability to run great interviews and make great decisions. So as long as the decision remains the organization’s, and that of the human beings within the organization, then everything is on the table for us.

[0:15:31.6] RS: Could you share an example of ways that customers have used the tool in maybe a way you didn’t expect? Of course, you know this is extremely rich data, but it is up to the creativity of the user to figure out how it’s going to inform their process. Have people used it in unique ways you didn’t expect?

[0:16:51.0] SM: Yeah, so there have been a couple of things, both pretty recent actually. One that I thought was super interesting was we had a customer that was reviewing who their most common interviewers were, and one of the data sources they were using was performance data: people who were performing weakly within the company, not very strongly within the company in general, nothing to do with their interviewing.

They thought those people should not be involved too much in the interview loops, and in one of these cases actually, the person in question turned out to be a very strong interviewer. So it turned out that wasn’t a good call, and actually there is not a linear correlation between being a top performer in the organization and being great at hiring, which is a really interesting distinction if you think about it, right?

Because for everyone who joins these companies, one of the most important things they’re doing is helping hire the next generation of people at the company, yet when that person was being reviewed, their performance reviews did not count their interviewing as hiring data. So they would be seen as a low performer, but actually they were [inaudible 0:17:41.6] a high-quality interviewer in that hiring loop, so that was a really interesting observation.

Another that just came to mind: we obviously had this expectation, and it happens all the time, that hiring managers, sort of the final decision-makers on a hire, would often review the interviews that candidates they’re seeking to hire had with other people in the interview loop. Maybe if they have certain question marks about certain attributes of the candidate, they would listen to the other folks’ interviews.

What is actually just as common, which is surprising, is the inverse: an interviewer speaks to a candidate and is really positive about them, then the hiring manager, often a more senior person within the organization, interviews them and decides to pass and not hire this person. Often we see the interviewer ask the hiring manager to be able to review that interview, wanting to calibrate on what this hiring manager is looking for so they can potentially be educated themselves, change their approach, and recalibrate on what great looks like.

But just as frequently, they use it to push back on hiring managers and say, “I think we’re making the wrong call there.” It is a characteristic of the companies we work with that they’re generally pretty meritocratic in practice and have a very good growth mindset. I think that is why it is happening a lot, but it is just really interesting that it happens just as much as the reverse, where the hiring manager checks the interviews for conflicts.

[0:18:53.2] RS: Yeah, it’s interesting, and back to the first thing you mentioned, that the poor performers are not necessarily poor interviewers, and also the opposite, your top performers are not necessarily good interviewers. That to me is jarring because I’m sure most companies select their interviewers based on performance, right? They’re like, “Oh, you’re the best person on our team, you’re our high performer on the team, therefore we should have you in those meetings.”

But it is so obvious now that I’m saying it out loud: being really, really good at software engineering, for example, does not make you really good at interviewing. It’s a completely different skillset. So is that somewhat common? Are people finding that their highest performers are not their highest-performing interviewers?

[0:19:34.6] SM: That happens a ton. The way that Metaview works doesn’t result in a ranking of everyone in the company by how good they are at interviewing. It is more about identifying where, I love the phrase that you used, we can tighten some screws, like what is the easiest screw to tighten, I guess. So it is not like you have this ranking, but there are definitely these observations people have because, as they’re using the platform, they realize that certain people who are running a ton of interviews, maybe seen as almost the final boss in an interview process,

aren’t actually so hot at interviewing, and I think there are a few things at play there. So why do companies often lean on the same people again and again to run interviews? One is some form of deniability: “Hey, this is one of our top performers. So if this person says this candidate is good, then at least I won’t look stupid if I hire them, because they said it was okay too.” It is not actually because they value that person’s opinion on its merits. It’s again, it’s a safety net. It is a –

[0:20:26.5] RS: They are trading on the equity of a high performer, yeah.

[0:20:28.5] SM: Yeah, and at the same time, this high performer may well, I mean look, I think everyone has the ability to be a great interviewer. This high performer sure as hell has the ability to be a great interviewer, but they’re getting pulled into interviews all the time. So of course, they are suffering from burnout and are not conscious of their declining ability to run great interviews, because they’re just doing it too much.

So there are a ton of variables at play, but you definitely see those characteristics. Some of our most sophisticated customers actually operate on a model where we will flag to them when interviewers are essentially running too many interviews. Different customers have it configured at different levels, but essentially, for a couple of my customers, if there is one interviewer who is running more than three or four interviews per week, then it’s flagged to them as too many, and we need to essentially unlock other people to learn how to run this interview, and that is where Metaview comes in.

So yeah, for sure, interviewer burnout definitely affects quality, and it often hits their most trusted people.

[0:21:24.3] RS: So the interviewer burnout threshold, you said, is like three or four per week. Is that just purely the amount they are doing, or do you measure, like, depreciating interview quality?

[0:21:33.5] SM: We do measure the characteristics of the interview, and you do see depreciation over time. For depreciation of interview quality, the best measure we have, or the most predictive measures we have, are essentially a combination, a formula we have that ties together talking time, question counts, and the distance between questions. That is sort of the formula we use to work it out.

The biggest thing that impacts it is actually the number of different interview types that any one person is responsible for in a given week. So if you are running multiple of the same interview, you do get burnout; we anecdotally get that feedback, that people are suffering from it, but you don’t actually see it in the data so strongly. So it looks like what’s actually happening is that they’re gritting their teeth and they are getting it done; they’re still doing okay.

Where you see real decay in those key metrics is when you are running three or four interviews per week and they’re actually three or four different interviews, and you’re having to recalibrate on what you’re trying to do in each one, and you end up not really making the switch and not really doing a great job. So that’s a completely novel insight that, again, people might have been able to guess would be true.

But to actually see it in the data, that yes, when you are split across various different interview types, the quality of every single one decays, is a really compelling insight that affects your ability to hire a great team.

[0:22:49.5] RS: I would have never guessed that, but maybe that is because I am a naïve podcaster and not a professional talent executive, but yeah, wow. So if you interview for different roles, then you are less likely to perform as well as when you interview for one single role.

[0:23:05.7] SM: Exactly, and to be fair, not just different roles, but also at the interview-type level. Even for a given role, you might have five or six different interviews, and they’re still different interviews, right? It is greater when you talk about a different role, that sort of amplifies how differentiated the interview is, but even across different interview types within a single interview loop you see degradation of quality.

[0:23:26.1] RS: Do you have any idea why that is, what’s your theory?

[0:23:27.9] SM: My theory, and this is completely personal, I’ve actually run a ton of interviews, both in my past life and more recently growing Metaview, is it’s just the switching costs. When you prep for an interview, which often you only give yourself a few minutes for beforehand, if you are running the same interview again and again and again, you almost get these economies of scale because you are doing the same thing.

So your need to prep for the next one is not so high, because you’re just coming off the back of one [inaudible 0:23:51.9] or you only need to somewhat refresh your memory. Whereas when you go into a different one, you’ve got muscle memory and you’re doing the same thing you were doing previously, but you realize during the conversation that it is not getting the type of answers you’re expecting, and actually a lot of the follow-up questions that you might often fall back on are not appropriate in this case, or not appropriate based on the types of responses this type of candidate for a different role might be giving. So it is just because conversations are fluid; it’s really as simple as that.

[0:24:16.5] RS: Yeah. It is context shifting that’s hard, and it is hard for anything. It’s why they say you shouldn’t multitask, why they say that if you get distracted from some task you are doing, it’s not just the time it takes you to go back to the tab you’re working on. It’s the extra time figuring out where you were, and interruption and multitasking are the enemy of being really productive.

It sounds like the same thing holds true for interview prep and executing an interview. Siadhal, we’re creeping up on optimal podcast length here. Before I let you go, I want to ask you: short of, of course, buying and implementing Metaview in their company, what are some things talent leaders, or people who are conducting a lot of interviews, can do right off the bat just to make sure that they’re having higher-quality interviews?

[0:24:58.0] SM: Yeah, I think there are probably two things one can do that Metaview helps with but, frankly, you know, are just important whether you decide to go down this route or not. One is to truly understand the competencies you’re looking for. I think there is a lot of focus, a lot of talking in circles, about knowing what questions you’re going to ask. Yeah, that’s true, and having at least a skeleton of a script upfront can definitely be useful.

But the most important thing as an interviewer, and actually for the hiring manager who is responsible for that role and whoever else is involved in that role too, is to truly understand the competencies you want to know about by the time you leave that interview. Say we want to know if this person has the ability to lead and motivate a team. If you internalize that mission, that that is your mission in this conversation, then actually the questions come easily, and I think the script is the easy bit.

It is really knowing that’s [inaudible 0:25:49.6] so really understanding what it is you’re looking to understand about the human being in front of you during this conversation. You’ll learn that by talking to the hiring manager before coming to the interview, or whoever it might be, so that’s one. Then the second is, if you’re literally unsure about when is the right time to nudge someone along in their answer,

when is the right time to interrupt someone during their answer, or what is a polite way to follow up if maybe the candidate didn’t answer in the depth that you wanted and you feel a bit rude saying, “Hey, can you say that again, that didn’t make sense,” what are some polite ways to follow up and get the information you need, then send a message to some of the more experienced people in the company and say, “Hey, I’d love to listen in on one of your next interviews. I’ve got to see how you do it.” I can’t stress enough that seeing it in action is a really great way to learn from others.

[0:26:31.9] RS: Yeah, I would be interested in those last few things, how to properly do that. I didn’t need to do any of that at any point in this conversation anyway, because you were a fantastic guest. So Siadhal, thank you for being here. Before I let you go, one last question: can we run this conversation through Metaview and figure out how good I am at interviewing, or not, as the case may be?

[0:26:49.6] SM: We certainly can, yeah. I am looking forward to getting that out. That will be great.

[0:26:53.2] RS: All right, stay tuned for that, podcast land, and in the meantime, Siadhal, thank you so much for being here. This has been a fantastic conversation. I really loved learning from you today, and it sounds like you have an awesome company that’s really, really valuable to talent pros. So thanks for being with us and sharing today.

[0:27:05.7] SM: Awesome Rob, thanks so much for your time. Great chat.

[END OF INTERVIEW]

[0:27:10.6] RS: Talk Talent to Me is brought to you by Hired. Hired empowers connections by matching the world’s most innovative companies with ambitious tech and sales candidates. With Hired, candidates and companies have visibility into salary offers, competing opportunities, and job details. Hired’s unique offering includes customized assessments and salary bias alerts to help remove unconscious bias when hiring. By combining technology and human touch, our goal is to provide transparency in the recruiting process and empower each of our partners to employ their potential and keep their talent pipeline full.

To learn more about how we can help you find your next great hire, head to hired.com/tt2m.

[END]