
March 10, 2025 31 mins


The construction industry is at a crossroads as it embraces the transformative power of artificial intelligence (AI). In this insightful episode, Shawn Gray delves into the pressing need for the construction sector, especially mid-sized firms, to engage with AI solutions that can substantially enhance productivity and operational efficiency. As labor shortages and demands for faster project delivery become critical issues, AI might just hold the key to unlocking greater profits and smoother project execution.

Shawn explains that while many firms are exploring AI, fewer than 4% are meaningfully adopting it. He shares real-world examples of how mid-sized companies are already using AI to automate time-consuming tasks, thereby redefining workflows and allowing teams to focus on higher-value activities. We discuss the unique challenges faced by field teams, who are often reluctant to embrace new technology due to past failures, and how AI can bridge the gap by offering intuitive solutions that speak directly to their needs and pain points.

Listeners gain a deeper understanding of the significance of not just adopting AI tools, but integrating them in ways that meaningfully improve on-site productivity. The episode underscores that ignoring AI, or failing to support its adoption, can jeopardize a company's competitive standing in an evolving market. This enlightening conversation is a must-listen for construction professionals eager to understand how innovative technologies can yield tangible benefits.

Join us as we explore the best practices for AI adoption in construction and the profound implications it holds for the industry's future. Don't miss out—subscribe, share your thoughts, and contribute to the conversation!

PODCAST INFO:
the Site Visit Website: https://www.sitemaxsystems.com/podcast
the Site Visit on Buzzsprout: https://thesitevisit.buzzsprout.com/269424
the Site Visit on Apple Podcasts: https://podcasts.apple.com/ca/podcast/the-site-visit/id1456494446
the Site Visit on Spotify: https://open.spotify.com/show/5cp4qJE5ExZmO3EwldN1HH

FOLLOW ALONG:
LinkedIn: https://www.linkedin.com/company/thesitevisit
Instagram: https://www.instagram.com/thesitevisit


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Welcome to the Site Visit Podcast: leadership and perspective from construction, with your host, James Faulkner.
Recorded live from the show floor at BuildX Vancouver 2025.
All right, Mr. Sean Gray, a veteran of the Site Visit

(00:26):
podcast.

Speaker 2 (00:26):
Good to be back, and my butt groove's still in the chair here.
Great, it is?

Speaker 1 (00:31):
It is.
I think we met here for the first time, correct?
We did, yeah, yeah, yeah. A couple years ago.
I remember you were... I wouldn't say that you were...
There's a certain type of intimidation that you had, that I felt a little bit intimidated by you in the beginning.

(00:53):
And I'll tell you why it was: you were very serious.
I'm like, oh, but this guy is out to get it, and that's cool, though.
And we've gotten to know each other over time, and I've got a lot of respect for your work ethic, and you're very focused on things.
So, very cool, it's good to know you.

Speaker 2 (01:11):
Well, we're dealing with serious things here.
We are.
People need houses built, we need critical infrastructure built, and we don't have enough people to do it.
It's pretty serious. Yeah, so we can have a laugh.

Speaker 1 (01:21):
Well, we can have a laugh, but you're right, it is
serious.
There's a lot of things going on.
So let's just... You are the founder and CEO of Construct IQ.
I would call you the metrics guy.
You used to have a lot of statistics over time. If anybody's listened to any of the old podcasts, you had some awesome things that we covered last time, but today we're going

(01:43):
to cover what your latest interest is in: AI and AI development in construction, where you see things going, opportunities, and how you see this playing out.
So first of all, and I think you and I are probably aligned on this, you see a lot of adoption in a lot of the larger firms.

(02:07):
They've got budgets, they've got lots of bandwidth to hire people to be able to just focus on something, make good revenue.
They're just balls to the wall, busy all the time, not necessarily paying attention to what opportunities they have in

(02:27):
front of them, and are probably not utilizing to its full extent exactly how much more productive they could be.
So maybe just take us through what your findings are in terms of, you call it, going from zero to one in terms of this stuff.

Speaker 2 (02:45):
Sure, yeah, and thanks.
And just for context for the audience here, because you're talking about AI, and that is the most saleable topic right now: just so the audience knows, you're talking with somebody who's walked the walk on meaningful implementations of AI across $25 billion of construction over the last 10

(03:05):
years.
So, you know, this is someone who's walked the walk on that. I just want to make sure we set the stage for this conversation.
When we say going from zero to one, I'm talking one being an instance of significant value discovered for the business, in a state where, let's throw some stats at you now, in a state

(03:27):
where over 40% of industry is exploring AI. And what that means behind that: almost 85% of individual human beings are using some shape or form of AI solutions in their personal lives or even at work.
Most of them are using those thanks to X.

(03:49):
Yeah, yeah, and other things. But when we say going from zero to one, it's how we're using these high-powered tools. In most use cases, we're barely scratching the surface of the value that they can actually bring to a business in terms of addressing the fundamental constraints preventing them from being

(04:09):
productive and profitable.
So when we say going zero to one: 40% of industry is actively looking at these things.
Less than 4% of them are using these AI types of tools in a meaningful way.
I'm talking about, hey, if you're using these things for reading and writing emails more efficiently? Do that. Amazing.
That's an incremental nugget of an improvement.

(04:32):
It's not major needle-moving value.
So when we say zero to one, it's going: where can you be deploying these things to truly address the bottlenecks preventing productivity and profitability, especially at the job site level?
And then, when we wrap this around those mid-sized firms: over the last year, since the time we chatted last, I started this real intense

(04:53):
journey that was really focused around those small, mid-sized firms. Because, as you said, the big groups, they have data teams and digital teams that eclipse 10 times the number of employees these guys have.
So bringing these things in and helping them understand that

(05:13):
they can be, as you said, you said it: these small firms are, you know, the scrappy groups.
A lot of them are established $500 million companies.
Yeah, some of them are $10 million or $1 million.
More often than not, everyone's wearing multiple, multiple hats, and their ability to do good work or take on more work is just capped by their human ability to produce.

(05:36):
So in this last year, I engaged over 250 construction professionals.
It was a busy, busy year, not much sleep, but we were on a mission.
40% were actively looking at things.
Less than 4% were really deploying it in a meaningful way.

(06:02):
Almost all of them, whether they said they were looking at it or not, almost all of them were hesitant to actually go forward with anything, just because of how they've been burned before on previous initiatives.
We've entered an age of kind of a double-edged sword here.
We've got an amazing ecosystem of technology, especially here in Western Canada: over 150 technology providers, most of them now providing some type of AI-enabled solution.

(06:24):
We're number three in the world for construction R&D, but it's an emerging technology.
There's no real established use cases or library of who's used these, and where and when.
So you've got folks that want to use them, but there's no evidence that these things are working well in certain areas.

(06:44):
So it's a weird chicken-or-the-egg situation that most of industry is in. And what I've looked at with these mid-sized firms: their desire to do more work with less is tipping them over that point.
They're embracing these new AI-enabled opportunities even before they have well-established centralized platforms and things like that, because you can provide

(07:07):
immediate value to a job site stakeholder.

Speaker 1 (07:10):
But what would that be?
Give me something specific.

Speaker 2 (07:14):
There's been some interesting... lots of different use cases. But what's kind of interesting about these things: a recent example from a highly innovative mid-sized group in your neck of the woods here. They started by introducing AI agents just for the purpose of automating meeting minutes in meetings and on projects.
Very trivial, actually didn't add too much value, but what

(07:37):
that did was fundamentally shift the job function of the project coordinator, who typically would have been recording notes.
They've now shifted to pulling up their BIM models, pulling up RFIs, pulling up relevant project context in that meeting to make those meetings more effective.
I'm going to tie all this together, because these are all

(07:58):
AI agents, so they're all integrated.
Yeah. So, automating meeting minutes; the job site superintendents, safety officers, quality managers being prompted throughout the day on key areas of activity or risk, so that those field folks can be putting in, via text or voice,

(08:22):
what is actually happening on the project during the day.
Being prompted to go pay attention to certain things on any given day that they may or might not know is critical to the project.
We know there's so many fires to fight, so helping them kind of understand where to be deploying their time. And then, when these agents, in this specific example, when we had

(08:45):
these agents deployed in meetings, deployed in the field, getting operator field-level information, because they're integrated, they started to understand a lot more project context between the different layers, and you're able to build a lot more risk-based or contextualized tasking of activities during the day.

(09:05):
That has been fundamental when we look at the headaches that our field leadership have.
A lot of the time, they're just responding to people's problems.
What's happened really, really quickly, in a matter of weeks: having these agents deployed, it's really helping them focus their time on what matters the most. And it's

(09:26):
impossible for these small, lean teams to be cascading, communicating the right information around, and almost overnight these agents are taking on that task. So I think that where the confusion sets in with a lot of companies is, even when,

(09:48):
when we start working with a company, they say, where's my data?

Speaker 1 (09:54):
Well, what? What service is it? Is it in the US? Is it here? Where is it?
So, you know, we give them information on where their data is.
How do I get that data? Make sure you're not sharing that with anyone else.
So the notion that their information is going into an

(10:14):
agent model, somewhere that is not private, how does that work?

Speaker 2 (10:22):
So that is usually the first point of learning with these groups, because that's usually what they're concerned about the most. A lot of it is driven by media-based dialogue, but it's quickly overcome when they understand that, I mean, there's a difference between those open AI, ChatGPT types of systems and systems that are project- or business-specific

(10:46):
instances.
So that is usually the number one learning.
It's closed loop, it's built for you.
There are backend algorithms that do things, but it's just functional.

Speaker 1 (10:57):
Where are those backend algorithms building the AI coming from?
The models have to come from something, at the macro level, because you're not building those yourselves.

Speaker 2 (11:11):
Some are building those themselves.

Speaker 1 (11:15):
I mean, I've gone through this with Make.com, for instance.
You go there, but that's going into a big database of stuff.
So what we've been looking at is that we would have two specific databases.
We basically would have one for capturing the data, and then you

(11:35):
have that information, depending on what the customer wants, then port itself into another database for analysis, because trying to crunch it from this, with all these different schemas, is a gong show.
So you basically have to, and you can have agents in between that are going to be doing some things.
Yeah, so I think that I was looking at the idea of agents in

(12:02):
a construction company, an AI agent, for instance,

(12:23):
[no transcript], where does that change?
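The capture-then-port pattern James describes here — one store that accepts raw field data in whatever schema it arrives, and an "agent in between" that normalizes it into a second store built for analysis — can be sketched minimally like this. All table names, field names, and the two example source schemas are hypothetical, not from any actual product:

```python
import json
import sqlite3

# Hypothetical sketch: one store captures raw field data as-is (schemas vary),
# and an "agent" step normalizes it into a second store built for analysis.

capture = sqlite3.connect(":memory:")
capture.execute("CREATE TABLE raw_events (id INTEGER PRIMARY KEY, payload TEXT)")

analysis = sqlite3.connect(":memory:")
analysis.execute("CREATE TABLE site_hours (project TEXT, trade TEXT, hours REAL)")

def capture_event(payload: dict) -> None:
    """Write whatever the field sends, untouched, into the capture store."""
    capture.execute("INSERT INTO raw_events (payload) VALUES (?)",
                    (json.dumps(payload),))

def port_to_analysis() -> int:
    """The 'agent in between': map differing source schemas onto one analysis schema."""
    moved = 0
    for _, raw in capture.execute("SELECT id, payload FROM raw_events"):
        e = json.loads(raw)
        # Two example source schemas mapped onto one target schema.
        project = e.get("project") or e.get("job_name")
        trade = e.get("trade") or e.get("crew")
        hours = float(e.get("hours") or e.get("labour_hours") or 0)
        analysis.execute("INSERT INTO site_hours VALUES (?, ?, ?)",
                         (project, trade, hours))
        moved += 1
    return moved

# Two messages arrive with different schemas; both land in one analysis table.
capture_event({"project": "Tower A", "trade": "electrical", "hours": 6.5})
capture_event({"job_name": "Tower A", "crew": "drywall", "labour_hours": 8})
print(port_to_analysis())
```

The point of the split is exactly the one made above: analysis queries run against one predictable schema, while capture stays permissive enough to accept whatever each customer's tools emit.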

Speaker 2 (12:28):
So that's one of the initial use cases for AI.
When we look at, I would say, what does AI even mean?
I would say automation and integration.
That's what it is right now for most groups.
So that part, we're going: hey, you've got these closed-loop systems.
That's what it means to them.

Speaker 1 (12:45):
You mean, rather than artificial intelligence?
No, I tell them that.

Speaker 2 (12:49):
Oh, I get it.
I said, don't think about artificial intelligence, because these things aren't actually thinking.
They're based on prompting and ones and zeros.
They're not thinking.
But I would say, for the highest-value use cases for you right now, think of it as automation and integration.
I see, okay. And the keyword is "and": do them both.

(13:11):
So those closed-loop systems and platforms, some of them promise and sell that they are amazing integrators with all the other ecosystem, when we know that that's been validated as maybe not true, or that's a big pain point of the community these days.
So that is a function that these agents are playing.
It's going: hey, you can now bring in job-function-specific, what we used

(13:33):
to call point solutions, job-task-specific solutions, and have agents running all of the different integration pathways between them.
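Shawn's framing of agents "running the integration pathways" between point solutions can be pictured as a small event router: each agent watches events from one tool and forwards a translated version to another. This is a minimal sketch; the tool names ("safety_app", "scheduler") and event fields are made-up illustrations, not any vendor's actual API:

```python
from typing import Callable

class Bus:
    """A tiny in-process event bus standing in for real tool integrations."""
    def __init__(self) -> None:
        self.handlers: dict[str, list[Callable[[dict], None]]] = {}
        self.log: list[tuple[str, dict]] = []

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self.handlers.setdefault(topic, []).append(handler)

    def publish(self, topic: str, event: dict) -> None:
        self.log.append((topic, event))
        for handler in self.handlers.get(topic, []):
            handler(event)

bus = Bus()

# "Agent": when the safety app flags a hazard, open a task in the scheduler,
# translating one tool's event shape into the other's.
def safety_to_schedule(event: dict) -> None:
    bus.publish("scheduler.task", {
        "title": f"Resolve hazard: {event['hazard']}",
        "site": event["site"],
        "priority": "high",
    })

bus.subscribe("safety_app.hazard", safety_to_schedule)

bus.publish("safety_app.hazard", {"hazard": "open trench", "site": "Lot 12"})
```

Each additional point solution just adds agents on new topics; no single closed-loop platform has to own every pathway, which is the appeal Shawn describes for mid-sized firms.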

Speaker 1 (13:41):
Do you know what the big gong show is here?
It's the Google Play and Apple App Stores.

Speaker 2 (13:48):
Elaborate on that.

Speaker 1 (13:49):
Because they're the bottleneck.
You can't just have some... They have to test your API in order to pass your app. Interesting.
So it's not like you can just put something out there with some totally open-ended, blank-slate application.
It just won't get passed. Right.

(14:11):
So that's what we've been thinking about with these things as well.
So where integration gets involved is where, if you're talking about multiple APIs through the App Store, for instance, Apple's the worst.
I mean, they're good in some ways because they're very

(14:31):
controlled, but they're the most difficult to do.
So if you were to have, for instance, one app that was going to integrate on the fly, to make that really work there on the mobile side, to get fast time, because you're talking about offline, you're talking all this...

(14:53):
It's a gong show to actually do in practice.
What's easy is if you are going on your own on the web, because I have a feeling that web applications are going to see a huge resurgence here, mobile-responsive web applications, because of this App Store restriction, because you

(15:15):
can go real-time right away, start interacting with models.

Speaker 2 (15:22):
Absolutely.
The web apps are where you're getting the most horsepower right now, because of that exact statement you made.
These are internet-based. Well, let's call those more the open models.
It's internet-based information gathering and moving, so it has the best ones.

(15:42):
The highest-value operating applications are web-based right now.

Speaker 1 (15:47):
Yeah, they would have to be, but that means no offline.
Well, there is no offline.

Speaker 2 (15:54):
Let's say you can maybe paint an artificial viewpoint of what it means to be offline, because some of these agents do things, let's say, in that offline state, where... I know what you mean.

Speaker 1 (16:10):
But offline on an app, at least you can get the UI.
You can't even get the UI on a web app without a connection.
But no, I think this is very interesting, having this conversation with you, because it is.
I look around and I see Sage here, and they're not really talking about much stuff.

(16:30):
They are here, kind of maybe still talking about the same things they were 10 years ago.
Good company, we're a partner with them.
But yeah, it seems like there are solutions from people with no customers who are trying to do things.
It's very risky.

(16:51):
It's very risky to start getting into doing some of these things, and the processes in construction, things are going already. Like, we even saw this when we saw the transformation from paper to digital. For a while,

(17:11):
they had to do both, because they weren't sure the digital was going to work out, so they still had to fill out the piece of paper.
Now, let's say they go from digital to automation or AI.
They now have to make sure that, A, it's not making any mistakes, and, if it is making suggestions, that it's not putting people on a wild goose chase either.
But shit, that doesn't matter.

(17:31):
So there's an interesting place in between.
I think that is where the rainbow to the pot of gold is.

Speaker 2 (17:40):
It's right in the middle, and I agree with you.
And even when you said these different companies and platforms are or aren't talking about AI: not everything needs to be about AI, and that's kind of what industry is trying to figure out right now, where are the right applications for this.
But then, when we're talking about, what is that magical area?

(18:04):
It is around more like those... I always help the groups understand: where are these areas that are lower risk?
You're not making a major critical decision out of this, but where is something that, especially around, let's call it,

(18:25):
the value of time?
Where is something that can take a significant amount of time away from your task, that can impact and have butterfly effects for many, many stakeholders, in the easiest way possible, that doesn't diminish confidence or require somebody that has 40, 50 years of experience to know, is this

(18:46):
the right call or wrong call?
So that is where, exactly as you said, that is that pot of gold at the end of the rainbow.
I think some of us have been doing this for a bit, but even just last year, these groups that have kind of embarked on that avenue, they're finding those immediate pots of gold of value, and it's not from them coming up with ideas.
It's these things:

(19:07):
you have to use them, you have to have those, especially the field stakeholders, discover those pots of gold.

Speaker 1 (19:13):
Do you find that the field is probably the most difficult?
It's a weird environment, because the office seems like it would be simple.

Speaker 2 (19:18):
It's a weird environment.
The office folks are actually some of the most jaded right now, because they are almost the most tech-fatigued, because they've been dealing with office-based products for almost 20 years in the industry.
The field groups, I think we talked about this maybe last

(19:40):
time, the field groups, they're sick of being forced technology that is really meant for data capture, to help somebody else out in the office.
So they're actually a weird stakeholder to work with, where they've been jaded before, they've been hurt before by these initiatives.

(20:02):
But coming in with an approach going, no, this is now about you. Everyone else has the tools, or should have the tools, that they need.
This is about you: how can these add value for your job?
And putting a lot of equilibrium back into that ecosystem, where you've brought in all that technology to help somebody else out.

(20:22):
The ecosystem's unbalanced.
So it's interesting: you bring things in that directly resonate with a field stakeholder's job function, and it balances the equilibrium out immediately.
We did an interesting project.
This early-stage startup group was doing some interesting report automation things, text-to-voice types of things.

(20:43):
The chief safety officer, think about the most stereotypical, cliched construction person, kicked the startup off the job the first day. Within four weeks, this CSO was the champion, because he understood.
He saw right away that we'd just automated LEM, labor counts,

(21:07):
capture from the trades: those two hours of this guy's day, like walking around hunting this stuff down, then transferring it. Immediately, this person was like, well, this took me no time, I didn't have to do anything.

Speaker 1 (21:19):
So where did the inputs come from?

Speaker 2 (21:22):
Right from the trades.
So, trades via text or voice, using AI prompts, to their mobile platforms. Right, but they have to download an app to do that?
No. Well, I think in this instance they used WhatsApp, just because that was a common job communication tool on that project, but it can, you know, work via SMS as well.

Speaker 1 (21:42):
I see, okay.

Speaker 2 (21:42):
Yeah. So this would-have-been typical high resistor was the champion at the end, because it was like, this is for you, this is to help you, to save time for you.
So it's an interesting, weird thing where the field groups, who had the bad rap of being the worst to deal with, they're

(22:04):
almost the most hungry for meaningful applications that can help them perform better.

Speaker 1 (22:08):
Yeah, I mean, you know, obviously, the position that I'm in, I've been thinking about this deeply, and I definitely have some thoughts.
We're seeing history repeat itself, just in a different

(22:36):
paradigm, but it's the same transformation as when we first saw the iPhone.
Mm-hmm.
To me, it's the same transformation, going from a baked keyboard to a not-baked keyboard.

(22:57):
So I mean, I see it pretty clearly, and I think that there is, and this is mostly to fix the field, because the field is where there's crazy inefficiencies.
You've also got a revolving door of staff, and you have all the way down to the lowest common

(23:20):
denominator of the bring-your-own-device situation.
Because, as you said, I think it was on one of the podcasts, you said that a high percentage of the revenue of construction is in projects that are small, right, all over North America.

Speaker 2 (23:35):
Yeah, ninety percent of... well, let's call it over 90 percent are smaller, mid-sized businesses. Exactly.

Speaker 1 (23:42):
Right.
So this isn't the building the hospitals and airports. I know I say that a billion times, but those big projects, institutional projects, that's not where all the money is.
A lot of the revenue is being made; it's a huge amount of revenue, but the projects are very few and very big.
But the smaller the company gets, the less organized it is,

(24:06):
unfortunately, and the more diverse the caliber of worker you're going to get.
Of course there's edge cases there, but on balance it's going to be a different kind of deal, right? And having software that is going to utilize these AI tools or automation tools, that

(24:29):
field part, it's not going to really... The office stuff is going to be ubiquitous.
I mean, for doing a schedule, you're going to have automation prompts and commands within software: take the schedule I had last time, attach this to read the blueprints, and give me a different schedule based on that.
That intelligence is going to have to come from somewhere.

(24:53):
I don't know where those models... Like, you know, they say, well, the AI. What AI? Which one? Like, where is this?
You know, what platform is reading this? What has enough servers and enough bandwidth to be able to crunch this?
And it has to learn somehow, right? And it can't learn the

(25:15):
wrong thing.
So you have to train models.
They don't just suddenly start working.

Speaker 2 (25:23):
It's interesting when you talk about "they have to learn from somewhere." In the early days of this journey last year, any seminar I've done usually is about field productivity.
Most of the audience is from pre-construction, wanting to know how they can improve estimates and things like that.

(25:43):
My first answer to them is: did you not listen to the entire seminar, about the fact that what is happening during project execution is what's driving whether you're going to make or break your estimate of profit?
That is what you need to fix.
We're not going to go talk about automating estimates, other than maybe takeoffs and things, but we're not going to talk

(26:04):
about predicting a project outcome, because we already know what it's going to be.
It's about, as the theme of this conversation, fixing what happens during execution, with the hypothesis that if the people that have the experience to know how to proactively prevent challenges are busy doing something else, whatever

(26:25):
that something else is, free that up so they can go improve the project.
Only until you get to that state... we're talking even a 1% improvement.
I think at the highest level, $8 billion GDP industry, a 1% improvement in productivity is $1 billion.

(26:45):
Yeah, so we go, just give it a nudge, just a little nudge, and that will translate into major project and profitability improvement.
Then start looking at some predictive performance on the front end.
But you can't fix something that's broken until you fix the right spot.
Speaker 1 (27:03):
So we're in the sticky middle right now.
Yeah, yeah. And I think what you're doing makes a lot of sense.
And I think, yeah, there's lots to be done.
I think, well, we can talk about this offline.
I hate to say this to the listener, but I'm not going to give away everything I've been thinking over the past number

(27:25):
of months, but it's definitely something we should talk about more.

Speaker 2 (27:39):
What do you

Speaker 1 (27:40):
see as some of the negative aspects to this?
Negative in what context?
What risks are there?

Speaker 2 (27:45):
There's obviously opportunities, and then there's risks too.
Well, I'll throw a weird risk at you.
I feel the biggest risk is around groups that aren't actually actively engaging in learning about these and exploring, because this is unlike the traditional platform era, where you press a button and it goes to the next step and does something. The next generation of tools,

(28:08):
you have to engage with them, and you have to engage with them for them to build and learn and get better.
So maybe this is a different area of risk than you were trying to flesh out, but I see these tools as things you have to use for them to get better.
The biggest risk that I'm seeing for a business is not just, where is the data, what's the safety around that. It's:

(28:31):
are you looking at these things?
Are your people using them in a manner that adds value?
Because if you're not, the risk is that these groups, just as we saw with, you know, wearable augmented reality devices, things that probably could have changed the game, people didn't pick them up and use them.
Those companies folded, collapsed, and went somewhere else.
So I find actually the biggest risk isn't what the AI can or

(28:58):
can't do, things of that nature.
It's the fact that you have to be using them.
You have to be supporting the companies that are building these things, or they will go away.
And what's interesting, and the reason why we developed the mission statement around the Prairie PropTech Association, is we have a number of amazing technology providers in this Western Canadian ecosystem.

(29:18):
A number of them have left, either folded completely or left to go down to Texas or overseas.
Because of that, what we're seeing is just that adoption gap, and I go, that is the biggest risk.
We've got these things that other companies have seen significant value in, where it's freeing up time and directly translating into productivity, which we need so much right

(29:40):
now.
But if they're not engaging with and supporting these groups, they will go elsewhere, and then you've lost that opportunity.
And that's just my take on where a big risk is.

Speaker 1 (29:50):
If the risk is you don't use it, you're going to lose it. Well, that makes sense.
Okay, well, that's pretty cool.
This has been good, always a good conversation. You know how many more we've got to start...

Speaker 2 (30:04):
We've got a whole series coming up now?

Speaker 1 (30:06):
Yeah, I know.
So how do people get a hold ofyou?

Speaker 2 (30:09):
You can definitely check out the website, www.constructiqadvisory.com.
Yeah, you can follow me on LinkedIn as well.
There's a major initiative helping subsidize the cost of expertise that's walked the walk on these things, lower technology costs, you know, really helping those mid-sized firms be the first movers and establish value.

(30:29):
You can see that on my website.
Also check out prairesproptech.com.

Speaker 1 (30:34):
Cool. Right on.
Okay, well, Sean, thank you very much.

Speaker 2 (30:36):
Well, thank you very much.
Always a pleasure.
Yes, sir, thank you. All right.

Speaker 1 (30:40):
Well, that does it for another episode of the Site Visit.
Thank you, contractors in North America and beyond.
SiteMax is also the engine that powers this podcast.
All right, let's get back to building.