A Crash Course in Freight AI with Garrett Allen
Episode Transcript
Spotify | Apple Podcasts

In this episode, Blythe discusses the world of AI in freight with Garrett Allen, the co-founder of LoadPartner. Garrett covers the current state of AI and how businesses can prepare for AI adoption, explains how AI can enhance logistics operations, addresses concerns about job displacement, and looks at the future of AI across the industry.

LINKS:

———————————————

THANK YOU TO OUR SPONSORS!

Are you experienced in freight sales or already an independent freight agent? Listen to our Freight Agent Trenches interview series powered by SPI Logistics to hear directly from the company’s agents on how they took the leap and found a home with SPI’s freight agent program.

Tai TMS is designed to streamline your brokerage operations and propel growth for both FTL and LTL shipment cycles. Book a demo with the Tai team today and tell them Everything is Logistics sent you.

Maximize your website’s performance as a sales tool with Digital Dispatch’s website management.

Show Transcript

Transcript is autogenerated by AI

Garrett Allen: 0:05

I do think that, like I mentioned earlier, the generation speeds with companies like Groq are going to really make a difference, because right now, when you ask ChatGPT something, you have to sit there and watch it type its response out. Basically, it's pretty slow. With Groq, we're crossing into that point where an AI can think faster than a person can think, and that just really opens up a lot of opportunities for real-time applications like robotics or even just better chat experiences.

Blythe Brumleve: 0:33

Welcome into another episode of Everything Is Logistics, a podcast for the thinkers in freight. I am your host, Blythe Brumleve. We are proudly presented by SPI Logistics and we've got a great guest for you today. We have Garrett Allen. He is the co-founder of LoadPartner, and we're going to be talking about, what else, AI and freight. It's the topic of conversation. It seems like you can't really escape it in the news or in your personal life or anywhere online. Garrett, you're the perfect person to talk to. Welcome into the show.

Garrett Allen: 1:03

Yeah, thanks for having me on.

Blythe Brumleve: 1:04

Absolutely. Now, we were just talking a little bit about your career backstory, but if you could just give the folks a little bit of a glimpse into how you've come into the freight industry, because you're not just this new AI company that's coming in as an outsider. You've been in the industry for a while.

Garrett Allen: 1:22

Yeah, yeah, I think I started in freight how a lot of people do. I kind of fell into it coming out of college. Basically started out at a freight brokerage really early on in Cincinnati, Ohio, which is where they're located. I was like the second IT guy they hired there. There were like 50 people.

Garrett Allen: 1:40

Long story short, small brokerage. They started seeing a lot of success, and as they were trying to scale, they were having issues with their technology. I offered: hey, if you guys give me the right resources, we can build you a TMS and help this company scale. They ended up betting on me. I built an engineering team and we built their TMS, and I worked there for just under a decade, I think eight or nine years, building that TMS. We launched it after a couple of years of work, built out the whole engineering team, and when I left, about eight or nine years in, they had actually just crossed over 500 employees and a billion dollars a year in revenue on the TMS we built. Learned a lot scaling there, just how a brokerage grows and the problems you run into. That was really great. Then from there I went and spent a few years at a digital freight brokerage, more of the tech-focused side, and learned a lot there. Yeah, after that I moved on to what's now LoadPartner.

Blythe Brumleve: 2:34

That's awesome. You built a TMS and you are still alive to tell the tale. How challenging was that entire process? How long did it take you? I imagine, if it's anything like the companies I've worked for, it took a long time.

Garrett Allen: 2:47

Yeah, we always said that it's a living, breathing thing. Even just getting past the hurdles of migrating from a legacy TMS, which was very old. The one they were on, for anyone that understands, was basically QuickBooks, on-premise, and Visual Basic, so a very old TMS. Migrating their entire history of finances into a new finance system and moving everybody onto the new TMS took us, I think, a little under two years in total; it was probably close to about a year, year and a half. And then after that it never ended. There's always a new feature, a new customer that needs a new thing. It was always ongoing.

Blythe Brumleve: 3:30

Do you miss it at all?

Garrett Allen: 3:33

Honestly, there's something to be said about being on that front line. I think it's really cool, especially at a company like that, seeing as much success as they did and growing so rapidly, to just have that. It was the typical startup life you hear about, where it's like, hey, we can make a whole bunch of money, here's a great deal, we need to be able to do these things with our technology, we need to support this kind of customer portal, this tracking. It was like, all right, there's not a lot of overhead, it was a really small team, all right, let's do it in two or three weeks, let's throw it together. It's really fun. I think after time it drains you. You can only keep that pace for so long, but it definitely was fun.

Blythe Brumleve: 4:15

I guess, speaking of the digital front line, not nearly as much as you have, but I have been diving into this world of AI for the better part of the last year, mostly on the marketing side of things. It's helped me so much in my day-to-day activities and how I think about business. It's also hurt me in some areas, just because I feel like I'm almost a little paralyzed to go after new projects, as I'm like, wow, that's definitely a candidate to be replaced by AI. And just from talking to a lot of people,

Blythe Brumleve: 4:50

There's a ton of confusion around just what AI is at a very basic level. And, I guess, for the audience listening, I want to frame this discussion in a few different ways. First, we're going to talk about the basics. Then we're going to get into a little bit of the use cases for using AI today, not only in freight but maybe some use cases outside of it. Then we're going to get into a little bit of what we think the future is going to look like. For folks listening, just tell us: what is AI? That phrase has been around for 70 years. How do you define AI?

Garrett Allen: 5:26

Yeah, I mean, AI, I think, could be broken down in a whole bunch of ways. The current wave of technology is really around generative AI. Historically, when we've talked about AI, it's really encompassed a whole lot of things, a lot of machine learning, where it's a lot more traditional coding: you put in A and you get out B, and it's always that way. Combinations of that can make it seem like intelligence. Basically, think about a video game AI.

Garrett Allen: 5:55

The new stuff is around generative AI. What that is, is it's essentially taking some concepts that have actually been around for a while and applying them to these large language models like ChatGPT. We basically have this huge tree of if-then statements, all these weights. What it does is, given a set of data, it tries to predict the next set of data that follows the trend. That's the most simple way to break it down when we talk about large language models. With ChatGPT, all it's really doing behind the scenes is it's getting a conversation between two people and it's saying, okay, now predict what user two would say. Then that's what ChatGPT says. They frame it like it's talking to you, but in reality it's just trying to predict the next word, basically. Boiled down, even the AI video and image stuff is very much similar.

Garrett Allen: 6:49

If you've heard of stable diffusion with image generation, the way that works is it basically starts with an image of scattered, random pixels that looks like nothing, and then asks the AI: okay, we expect it to look more like a dog, predict how those pixels would rearrange to look more like a dog. And it does that over and over and over again until it gives you your final result. So yeah, boiled down, it's really just predicting what it thinks the next word is going to be, and there's just a lot of clever ways to use that to frame it so that it can actually take actions and do things and seem like intelligence, basically.
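Editor's note for readers who want to see the "predict the next token" idea in code: the toy sketch below, in Python, is only an illustration of the concept Garrett describes. The vocabulary and probabilities are made up; a real model learns billions of weights rather than a small lookup table.

import random

# Hypothetical "weights": for each word, the probability of what comes next.
weights = {
    "the":  {"load": 0.5, "driver": 0.3, "rate": 0.2},
    "load": {"is": 0.6, "was": 0.4},
    "is":   {"late": 0.7, "delivered": 0.3},
}

def next_token(prev: str, sample: bool = False) -> str:
    options = weights.get(prev, {"<end>": 1.0})
    if sample:  # sampling adds the variety that can look like "imagination"
        tokens, probs = zip(*options.items())
        return random.choices(tokens, weights=probs)[0]
    return max(options, key=options.get)  # greedy: always pick the most likely word

text = ["the"]
while text[-1] in weights:
    text.append(next_token(text[-1]))
print(" ".join(text))  # e.g. "the load is late"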

Blythe Brumleve: 7:22

That's a difference in how I was thinking about it, because, admittedly, I was thinking, okay, this is data-based, it can't really have an imagination, it's based on the inputs that you're putting into the system. And you're telling me that that's a little bit off base on how it actually processes things?

Garrett Allen: 7:41

Yeah, I mean, it could definitely have sort of an imagination. It's hard to put it. LLMs are actually designed quite similar to neurons in a brain, where basically it goes to a point and says, okay, we're here, we know of X, Y data points, and based on the weights here, they're called weights, basically like probability, we expect the next thing to go down this path instead of the other one. And it basically just follows these paths until it gets to the end and gives you what the next letter is, or the next token.

Garrett Allen: 8:14

Basically, by tuning these models, it can take combinations of, you know, paragraphs, thoughts that people have written down, images, whatever, and combine them and sort of create new things out of the input data it's been trained on. So, depending on how you define what a thought or an imagination is, it sort of can. I think in some ways it's an interesting thing to think about. It really starts getting abstract when you're talking about, does AI have thoughts, or is it sentient?

Blythe Brumleve: 8:50

I think there's a lot of arguments in both directions, honestly. Yeah, because there's a lot of people that say they use please and thank you when they're talking to ChatGPT, just in case, you know, the robot remembers in the future who was nice to it and who wasn't. Is that a thing?

Garrett Allen: 9:06

Yeah, yeah, I mean, I don't think it's going to actually remember who was nice to it in the future, if it ever gets there, to be honest. But yeah, it's funny. I think about the Google engineer from, what was that, three or four years ago, that was making the rounds when Google was first crossing into this on the R&D side, saying they've created a sentient AI. It was making the rounds like, this crazy engineer, oh my gosh. And then ChatGPT came out and it was like, oh wow, he saw this when nobody else had seen it yet. You can see where he was coming from.

Blythe Brumleve: 9:37

So yeah, I think it's super interesting, because for so long, especially in our industry, logistics, it's been one of those things where a lot of the blue-collar work has been theorized as the first thing that's going to be replaced by new technology, you know, warehouse robotics and autonomous trucks and things like that. But now, in such a short amount of time, it feels like the complete opposite, where the white-collar workers are the ones that are probably going to be replaced sooner than anybody working in blue-collar work. Do you see it the same way?

Garrett Allen: 10:14

Yeah, I mean, definitely white-collar jobs, really digital jobs, are ultimately the things that are most at risk, which really plays into mostly white-collar jobs. Especially places where we have text communications that have historically been sort of difficult, kind of a gray area, right. We had natural language processing in the past where we could sort of see what somebody was asking, what they were trying to do, but it was really hard. There was too much overhead, it didn't make sense to automate it, just put a person in there. Now, with these large language models, it's so much more approachable. It's like, we can just automate this. It's already digital, we already have the communications, the LLM can understand this and say what we're trying to do. It really is so much easier.

Garrett Allen: 10:53

And the other hard thing with blue-collar work is, you know, obviously, the physical world: going and turning a wrench, picking a box up.

Garrett Allen: 10:59

You start getting into robotics, and one of the things with robotics is being able to have a machine balance something and being quick with response times and things like that.

Garrett Allen: 11:10

I think we're going to start seeing a transition into more of that stuff as some of these companies, like Groq, if you've heard of them, are really making these LLMs generate really fast. They can actually generate a response faster than you or I could think about what the response is, which is completely different from what the historical experience has been with ChatGPT. When we get into the scenario where an LLM can generate a thought five times, ten times faster than a human can, well, now we can start applying that to robotics, right. Now a robot can actually play baseball or something, do something that requires more finesse, because it can generate thoughts at a speed quick enough to actually do that. So I think, as we see the generation speed for tokens in these LLMs, and not just LLMs but other AI models, increasing, we'll see more of the shift into some of that more blue-collar work.

Blythe Brumleve: 12:00

Are you in freight sales with a book of business looking for a new home? Or perhaps you're a freight agent in need of a better partnership? These are the kinds of conversations we're exploring in our podcast interview series called the Freight Agent Trenches, sponsored by SPI Logistics. Now, I can tell you all day that SPI is one of the most successful logistics firms in North America, who helps their agents with back office operations such as admin, finance, IT and sales. But I would much rather you hear it directly from SPI's freight agents themselves, and what better way to do that than by listening to the experienced freight agents tell their stories behind the how and the why they joined SPI? Hit the freight agent link in our show notes to listen to these conversations, or, if you're ready to make the jump, visit spi3pl.com. And so you mentioned LMs a couple of times, and I've been referring to them as LLMs, so it sounds like that's the slang I should be using.

Garrett Allen: 13:00

No, it's LLM. LLMs, large language models. Yeah, sorry, I'm kind of slurring that.

Blythe Brumleve: 13:07

Okay, I just wanted to check. I was like, oh wow, I should be glad that I asked, because I was totally making a mental note: okay, don't call them LLMs anymore, just call them LMs.

Garrett Allen: 13:15

No, no, no, I'm just, I'm just not pronouncing it all the way through, I guess.

Blythe Brumleve: 13:19

Okay, so speaking of those, I guess, to sort of lay out the landscape: we have ChatGPT, we have Anthropic's Claude, and they just launched a new one. Then we have Google's Gemini, which used to be Bard, and we have Perplexity, and we have all of these different LLMs. Which ones are the ones that you're using the most today?

Garrett Allen: 13:49

Yeah, so that's a great question. LoadPartner behind the scenes is actually sort of a series of AIs, it's not just one. Some of the ones we use: we actually use a lot of the Mistral models, if you're familiar with those. Those are free and open source, so we don't pay for the models themselves; we pay for hosting, obviously, but they allow us to fine-tune. We also use some of GPT, so we use GPT-3.5 Turbo and GPT-4, and then we've been experimenting with Claude 3 Opus, and actually the Sonnet model is quite good.

Garrett Allen: 14:25

Not to reveal too much, but basically at LoadPartner we have a strong belief that a single AI model is not actually the most effective way to give the best experience. The best way to do it is to have multiple AIs working in collaboration to give you a single AI experience, where you have different experts behind the scenes, each focusing on their own area of expertise and doing what they need, and then bubbling that back up as a single response.
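Editor's note: a bare-bones sketch of that "several experts behind one AI experience" idea might look like the routing below. The task names, model choices, and routing rules are invented for illustration and are not LoadPartner's actual architecture.

def classify_task(message: str) -> str:
    # In a real system this step might itself be a small, fast model.
    text = message.lower()
    if "where" in text or "eta" in text:
        return "tracking"
    if "rate con" in text or "invoice" in text:
        return "documents"
    return "general_chat"

# Hypothetical registry: each task type goes to the model best suited to it.
MODEL_FOR_TASK = {
    "tracking": "fine-tuned-mistral",    # fast, cheap, domain-tuned
    "documents": "gpt-4",                # slower, stronger at extraction
    "general_chat": "claude-3-sonnet",   # solid conversational default
}

def handle(message: str) -> str:
    task = classify_task(message)
    model = MODEL_FOR_TASK[task]
    # A real implementation would invoke whichever provider SDK hosts that
    # model; the string below just stands in for that call.
    return f"[{model} handles the '{task}' request]"

print(handle("Where's this load at?"))   # routed to the tracking specialist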

Blythe Brumleve: 14:54

It almost feels like, and this is for, I guess, millennials who grew up in a certain time, such as myself: there was a big sort of console war, right. It was Nintendo versus Xbox versus PlayStation, and it was really the battle of the exclusives. And I would preach all day that Sony has won this war. But through that, I guess, competitive lens, is that maybe a good or bad comparison for the LLMs that are out today, that they have different exclusives, they have different expertise? Is that the way we should be thinking about these things?

Garrett Allen: 15:35

I think it's almost too early to tell. And I say that because one of the big things with these different providers is that most of them are trending towards what OpenAI has already been doing, especially around things like native function calling with new models that come out, and just the scheme of how you interact with the model. Most of them have just said, okay, we're going to do it how ChatGPT kind of does, because that's kind of the standard. I think what we'll see long term is that we'll end up with something like what you're describing, where there are models that are just better for certain tasks than others, and that's going to be more of the separation, the exclusivity. Right, you're saying this one can generate better images, so if you want to do images, use this one.

Garrett Allen: 16:22

Right now we're just not at that baseline yet, I think. Everything is still advancing so rapidly. Something that's not as good at images today could be the best at images tomorrow, and so it's too early to make that distinction. But definitely in the future, the way I see it, there will be a lot more models, and kind of more what you're saying: you'll have models that are actually better for certain tasks, and people will basically lean into those based on what they need.
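Editor's note on the "native function calling" Garrett mentions: with OpenAI-style chat APIs, you describe tools to the model and it can answer with a structured call instead of prose. A minimal sketch is below; the send_location_link tool is a made-up freight example, not a real LoadPartner or OpenAI function, and the snippet assumes an OPENAI_API_KEY is set in the environment.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Describe a tool the model is allowed to "call". What actually happens
# when it asks for the tool is up to your own code.
tools = [{
    "type": "function",
    "function": {
        "name": "send_location_link",
        "description": "Text the driver a link that reports their current location",
        "parameters": {
            "type": "object",
            "properties": {"load_id": {"type": "string"}},
            "required": ["load_id"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Where is load 1042 right now?"}],
    tools=tools,
)

# If the model chose the tool, this holds a structured call such as
# send_location_link with {"load_id": "1042"} for your code to execute.
print(response.choices[0].message.tool_calls)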

Blythe Brumleve: 16:52

That's super interesting, because I heard from someone recently who pretty much said that we're going to reach a level where all of them have the capabilities of a GPT-4, right. This is what Google Gemini is chasing, this is what Anthropic is chasing with their Claude model, I think Opus is what they're calling it now. So they're all going to reach the same level as ChatGPT, we're just going to reach parity, and then it's going to be a situation where you take your data and combine it with one of these models, and then you'll be able to work off of, I guess, that combined intelligence. But now what you're saying is that maybe you could still use your own data sets, but you're going to pick certain models that are better at certain tasks.

Garrett Allen: 17:39

Yeah, and think about it this way, as far as when we reach parity. Like you mentioned the Opus model, Claude 3 Opus, and it's pretty good; I think it's better than GPT-4 for the use cases I've used it in. But I think that goalpost will just keep moving. We'll never truly be at parity, because there's always going to be: this one thinks a lot further ahead and can give you even more detailed explanations, but this one's faster, and this one's better at writing code for you. We'll never hit a point where it's like, okay, it's good enough; it's going to just always keep getting better. And so once we hit the point where the general conversational aspect of it is good enough, I think we'll start seeing those models become more specialized, where you get more of that exclusivity that you mentioned. And there's also different use cases too, right?

Garrett Allen: 18:31

So beyond just the obvious, this one's more detailed or this one's better at writing code, you also have other tradeoffs, like how fast does this one generate, is it quicker, is it slower. There are different use cases, based on what your business is trying to do, for which one you would want. This one is writing emails: I don't care if it's slow, I just want it to write the best possible email. Versus this one's a chatbot: I want it to be fast and respond instantly. So I think a lot of those different aspects will ultimately distinguish them in the future.

Blythe Brumleve: 19:01

Yeah, it sounds like this is a good place to get into, I guess, the anxiety around some of these tools before we get into some of the use cases. I can only speak for myself and the folks that I hear from, but it feels like there's either just complete ignorance, like, I don't want to know about it, or there's a fear that this is going to replace my job. In my own use cases, I'm seeing it as, okay, there are going to be a lot of services that I offer now that are going to be replaced by AI, so how do I even plan for my business moving forward if all of this is taking place right now? I want to ask this before we talk about LoadPartner, because I imagine it comes up in conversations where people are almost scared to adopt it because it's going to replace them.

Garrett Allen: 20:00

Yeah, I mean, AI really is kind of a new frontier in technology, I think. So with something so new, it's definitely understandable that people would be afraid of their job getting replaced, or of what the implications of using this really are. The way I see it, ultimately it's going to really help people do more with less. Figuring out how you can use it to provide an even better service is really the best way to approach it. Approaching it with the intention of, how can I provide the same service for less, or put less effort into it, well, people are going to do that, and it's going to happen, it's already happening. But everybody knows. If you go on LinkedIn, for example, and you look through all those posts, you can immediately tell who clicked the rewrite-this-with-AI button, right. People are always going to pay more for and appreciate the genuine service. I think that's always been true, and it'll stay true. So the places where you can still provide a really good service, but just improve it even more with AI, those people are going to be the most successful.

Garrett Allen: 21:09

And the people that are just trying to outright replace it, make it cheaper, just pump out content, things like that, people are going to know. It's going to be, you know, the cheap knockoff. Everybody's going to understand that, and they're basically going to be considered the lower value. People are going to pay less; they're not going to pay as much attention to it.

Garrett Allen: 21:26

So, you know, I think looking at it that way, whatever the business may be, really helps you go, okay, how do I want to pull in AI? It's like, okay, well, I probably don't want to just have this write my articles for me, but maybe I can use it to get pointers on an article I did write, and then maybe tweak some stuff from an article I've already written, so you go, oh yeah, I didn't think about this little thing, or the way this word could be improved a little bit. Or there's a lot of things around taking quicker notes from a video, or getting transcriptions. There's a lot of advantages there. Nobody cares if you spent two hours getting the transcription written down for a video versus if the AI generated it for you, right? There's not really value there, so use AI. I think just weighing those back and forth as you're going through everything is really important.

Blythe Brumleve: 22:17

Yeah, transcripts, having them auto-generated by AI, and meeting note-taking apps, have probably been the largest impact on my career over the last year, year and a half. I remember having to hand-transcribe, and when I say hand-transcribe, I would hit play on a video, type out what they said, pause, hit play on a video, type out what they said, pause, and it was just repeated over and over and over again. It took so much time, but now I can get that stuff out so much faster because of these different transcription tools. So that is, I guess, a good segue into some of the use cases, because you've been building LoadPartner. Tell us, I guess, sort of the backstory on that. What was the catalyst for starting up LoadPartner, and what does it do?

Garrett Allen: 23:05

Yeah, so basically with LoadPartner, one of the big things for us is, we feel like a lot of technology companies, especially in the freight tech space, are trying to do too much. There are a lot of apps out there, a lot of, hey, you've got to download our app, or you need to use our tracking software or whatever, forcing drivers, customers, whoever, to come to them. And we've seen that there's a real need for providing more of that service and automation through conventional methods, like SMS, phone calls, emails, written communications. Drivers have smartphones, right, so they can open a link on their phone. So basically what we're doing with LoadPartner is we built a 24/7 AI load coordinator, and if you wanted to start using LoadPartner, you'd go through our onboarding process.

Garrett Allen: 23:55

You would fill out some things about how you run your freight, when you make check calls, how often you want location updates from the driver, things like that, and then the load coordinator within LoadPartner would, based on those preferences, make calls out to the driver and send text messages. We have location links we can send out that they can open to give a location, and basically just help you manage your freight that's on the road. And the cool thing is we're doing it all through SMS, links to a website, and emails, so there isn't an app they've got to download or new technology that drivers have to learn. It's all coming to them. So ultimately, we're just making communication between brokers and drivers, or whoever's managing freight, a lot better and a lot simpler, and letting those teams focus on other things.

Blythe Brumleve: 24:43

And so for LoadPartner, is it a standalone app, or is it an integration into their current platforms? What does it look like from, I guess, a user standpoint?

Garrett Allen: 24:53

Yeah, so actually it's both of those. We've built it so that, if you want to use it standalone, we have a whole web UI. You can go in and import your load directly through there. We have a number of import options for some popular TMSs, as well as CSV and email. We can actually even take images, like if you have a rate con or something, and import your information straight off of that, and then you can run it all straight through LoadPartner and get your text saying that something's going on or whatever.

Garrett Allen: 25:19

Or, if you want to do integrations, we also have inbound and outbound APIs, so you can shoot us over your load information through an integration and we can send the events back directly into your system. In that case we even have a drop-in chat, so you can drop a LoadPartner chat directly into your TMS if you want, or just let your user pop it up on the side and talk directly to our AI, and kind of get AI in your TMS for free. So yeah, either of those works.
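Editor's note: to make the inbound/outbound idea concrete, here is a hedged sketch of what such an integration could look like. The URL, field names, and auth header are invented for illustration; LoadPartner's actual API contract may differ, so treat this as a shape, not documentation.

import requests

API = "https://api.loadpartner.example/v1"            # placeholder base URL
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}   # placeholder auth

# Inbound: push a load from your TMS so the coordinator can start managing it.
load = {
    "reference": "LOAD-1042",
    "driver_phone": "+15555550123",
    "pickup": {"city": "Cincinnati", "state": "OH", "appointment": "2024-04-01T08:00:00Z"},
    "delivery": {"city": "Atlanta", "state": "GA", "appointment": "2024-04-02T14:00:00Z"},
    "check_call_frequency_hours": 4,
}
requests.post(f"{API}/loads", json=load, headers=HEADERS, timeout=10)

# Outbound: your system exposes a webhook and receives events back
# (location updates, driver replies, ETA changes) as they happen.
def handle_webhook(event: dict) -> None:
    if event.get("type") == "location_update":
        print(event["load_reference"], event["latitude"], event["longitude"])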

Blythe Brumleve: 25:46

Oh, that's interesting, because it can just sit on top. I imagine it would be such a big hurdle to overcome if you were trying to convince somebody to move away from their existing TMS in order to use your platform. But I think that sounds really cool, that it actually sits separately from it, and you can get AI benefits without having to trust that your existing technology provider is going to build it the right way.

Garrett Allen: 26:11

Right. Yeah, like I said earlier, I already built a TMS once and I'm not trying to do that again. So please, just integrate so that we can use it that way; that's definitely the preference. And we've referred to it as sort of a TMS lite, because we have the basics in there. You can manage a load, appointment information, the bare minimum you need, but we don't have anything around billing, for example. You can't put billing information into LoadPartner, things like that.

Blythe Brumleve: 26:37

Brokering success demands a battle-ready strategy. Tai TMS equips freight brokers with the ultimate battle station for conquering a tough market. With Tai, brokers gain access to a comprehensive platform where rate intelligence and quote history converge on a single screen. It's not just a page, it's a strategic command center designed to help brokers win. Tai equips your team with all of the data they need to negotiate with confidence and allows them to communicate directly with carriers and customers from a simple control base. Revolutionize the way your brokers perform by giving them a competitive advantage with Tai TMS. For more info, go to tai-software.com/battlestations, and we also have a link for you in the show notes to sign up for a demo.

Blythe Brumleve: 27:25

To prepare for this interview, I was listening to your interview on PDQ America's Trucking for Millennials podcast, which is a great episode, by the way; I'll link to it in the show notes. They were talking about different, I guess, etiquette around check calls, and how drivers will be pulled over resting in the middle of a run and then get a phone call in the middle of the night asking them where they are, or they're getting these check call requests at seven in the morning. It's bringing up all of these new questions that I don't think we've ever really thought about asking around etiquette in this industry, and I thought that was a really great conversation they were having with you. So how do you think about etiquette when you're building LoadPartner and the features that are really beneficial to a lot of the users?

Garrett Allen: 28:11

Yeah, so it's actually a big one for us. I kind of touched on it earlier, about coming to the drivers instead of forcing them into another app, right, just SMS, email, the usual. I think there's a lot of benefit there. And then we also have some stuff around driver preferences that we have planned for later this year, not to get too far into it because it is for later this year, but essentially we want to give a lot more control to the driver.

Garrett Allen: 28:38

If LoadPartner is running your load and it has your preferences and knows your location, we can infer your ETAs based on certain things.

Garrett Allen: 28:46

You know, if you have, for example, quiet hours where we know you're going to be pulled over resting, we won't call you, we won't text you. Those are things we're going to surface to the driver to set, so that they won't get that midnight call, because we can see, hey, their last location was within an acceptable amount of time, so we know their ETA is probably roughly right; we'll just call them in a couple of hours when they wake up. So again, we're not just helping whoever's managing the freight. We're really trying to also give tools to the drivers, to make it so they like running loads with us because it's nicer for them. They can equally text in to their LoadPartner number and immediately get responses about appointment time information or pickup numbers, things like that. LoadPartner's load coordinator is available 24/7 for them just as much as for a broker.
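Editor's note: the "don't make the midnight call" decision Garrett describes can be pictured as a small rule. The preference shape and the three-hour freshness threshold below are assumptions for illustration, not LoadPartner's actual logic.

from datetime import datetime, time

def in_window(t: time, start: time, end: time) -> bool:
    if start <= end:                 # window within a single day
        return start <= t < end
    return t >= start or t < end     # window spans midnight, e.g. 22:00 to 06:00

def should_contact(now: datetime, quiet_start: time, quiet_end: time,
                   hours_since_last_location: float) -> bool:
    resting = in_window(now.time(), quiet_start, quiet_end)
    location_is_fresh = hours_since_last_location <= 3.0   # assumed threshold
    # Skip the call when the driver is resting and the last location is
    # recent enough that the ETA estimate is still trustworthy.
    return not (resting and location_is_fresh)

# 12:30 AM, quiet hours 22:00 to 06:00, location seen 1.5 hours ago: don't call.
print(should_contact(datetime(2024, 4, 1, 0, 30), time(22, 0), time(6, 0), 1.5))  # False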

Blythe Brumleve: 29:35

Oh, interesting. And so who's doing it? Is anyone hitting send on these text messages? Is anyone picking up the phone and making these calls, or is that all automated through the software?

Garrett Allen: 29:47

Yeah, so it depends on your preferences; it can be fully automated. We have, like I said, SMS, and then we also have AI voice calls, so LoadPartner can actually call a driver and ask what their ETA is or what their temperature is. If you want to make the call yourself, you can. And with SMS, same thing. We have a human-in-the-loop setting, so a user can say, don't send anything to the driver unless I review it and say okay first. Same thing for other actions, so you can lock actions of the AI behind an approval process if you aren't quite sure, because we know that jumping into full AI automation from nothing can be a bit daunting. So we're trying to put the tools as much as we can into the user's hands.
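Editor's note: the "human in the loop" switch Garrett describes boils down to holding AI-drafted messages for review instead of sending them. The sketch below is a minimal illustration with invented names, not the product's code.

from dataclasses import dataclass, field
from typing import Callable, List, Tuple

@dataclass
class OutboundQueue:
    require_approval: bool = True                      # the human-in-the-loop switch
    pending: List[Tuple[str, str]] = field(default_factory=list)

    def propose_sms(self, to: str, body: str, send: Callable[[str, str], None]) -> None:
        if self.require_approval:
            self.pending.append((to, body))            # hold for a person to review
        else:
            send(to, body)                             # fully automated path

    def approve_all(self, send: Callable[[str, str], None]) -> None:
        for to, body in self.pending:
            send(to, body)
        self.pending.clear()

def send(to: str, body: str) -> None:
    print(f"SENT to {to}: {body}")

queue = OutboundQueue(require_approval=True)
queue.propose_sms("+15555550123", "Hi, what's your ETA to the receiver?", send)
print(len(queue.pending))      # 1, nothing sent yet
queue.approve_all(send)        # now the reviewed message goes out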

Blythe Brumleve: 30:28

Yeah, for sure. And I've heard you talk about the analogy, for data management, of garbage in, garbage out. How do you make sure that your data isn't garbage, though?

Garrett Allen: 30:41

Yeah, so I have a lot of thoughts on this topic. The floor is yours. Honestly, one of the cool things with all these models is that they are pre-trained. If you go to ChatGPT and you're going to talk to ChatGPT, you don't need to preface it by telling it who you are and what kind of stuff you want to talk about. You can just ask it a question and it generates something back. They're all pre-trained to a certain point.

Garrett Allen: 31:08

So for LoadPartner, for example, our AIs: we've built all these freight tools under this freight platform and then we kind of threw AI on it to connect the dots, right. That AI isn't just a ChatGPT wrapper where it's just, hey, ChatGPT, can you write a text message for this driver? We have, like I mentioned earlier, a number of different models working in collaboration to give you that experience, and they're all trained already, not on your data, on other data, on how to run freight. So they understand what a load is, what a pickup is, what a reefer is. It understands those concepts. So we don't need all your data to train a model; we've already done that part. All we need to know is, hey, what's the load number, what's the driver's number, when are they picking up, when are they delivering? Just those bare minimums that are pretty low-hanging fruit for most organizations to provide. We don't need the last 10 years of your freight, we just need to know what's on the road today. So for a lot of these companies, I think data integrity is still important, and every company should still keep that in mind; there's value there.

Garrett Allen: 32:15

But to actually get started using AI, the quality of your data is actually not nearly as important as with traditional machine learning. Back in the day, if you were building a machine learning model to, I don't know, calculate a rate, and there's a lot of rate-matching stuff with machine learning, that data had to be really cleaned up. Most of the work that goes into that is just building the pipeline to feed that data into the model, train it, and have it understand your rates and predict stuff. You don't have to do any of that with all the new AI stuff.

Garrett Allen: 32:47

It's already trained. It knows what freight is and knows what to do; it just needs to know the details of what we're doing right now. So AI is actually a lot easier to do than a traditional machine learning project, I would say. So anybody that's interested in getting into AI, definitely keep that in mind. If you've done ML stuff in the past and you're like, wow, that was really hard, I don't want to do that again, or that seems unapproachable to us, that doesn't necessarily rule out doing AI, because it's a lot easier.

Blythe Brumleve: 33:19

How? I guess this is going to sound like a dumb question, but how did you tell the AI? How did you break it down, I guess, from a functional standpoint? How did you tell it what a load is, what freight is, what logistics is? Is it as simple as just writing it out in a text document and teaching an LLM?

Garrett Allen: 33:42

So one of the cool things with AI, and this is a great question, by the way, is that it's very new. There's a lot of a try-it-out-and-see-what-works kind of approach.

Blythe Brumleve: 33:54

But what's that first step? You know, like, what do you do, what do you try? Like, what do you do?

Garrett Allen: 33:58

Yeah, so we basically built this AI framework that helps us really quickly iterate on giving the AI tools and helping it understand what it can and can't do, breaking that down. Ultimately, at the end of the day, everything is a combination of fine-tuning, which is where you take a model and give it examples of, here's what you got, here's what you said back, and you can make up what it said back, you can just say, this is what you should have said back. You give it all these examples and you fine-tune it.

Garrett Allen: 34:28

What that does, like I mentioned earlier, is that these LLMs are just big trees of weights, and if you fine-tune it, those weights actually get modified, so that when you give it something, the results more closely match the examples you've given it. So we've done that in some cases where we've basically said, hey, if somebody says they want the latest location for a load, you should send a location link out to the driver, right, and we fine-tune our models to know all this. So then later, when it sees the broker say, hey, where's this load at, it goes, oh, I know, I was already fine-tuned for this, and it sends that link out. It's the same approach with just freight: what does a reefer mean, what kinds of trailers need what kinds of things, what is a temperature, you don't need a temperature on certain trailer types, things like that.
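Editor's note: "give it examples of what it should have said back" typically means preparing a file of training rows like the sketch below, shown in the chat fine-tuning format used by OpenAI-style APIs. The freight content in the examples is invented for illustration.

import json

# Each row pairs an incoming message with the reply we wish the model had
# given; fine-tuning nudges the weights toward answering this way.
examples = [
    {"messages": [
        {"role": "system", "content": "You are a freight load coordinator."},
        {"role": "user", "content": "where's this load at"},
        {"role": "assistant", "content": "Sending the driver a location link now. I'll share their position as soon as they open it."},
    ]},
    {"messages": [
        {"role": "system", "content": "You are a freight load coordinator."},
        {"role": "user", "content": "reefer temp?"},
        {"role": "assistant", "content": "This is a reefer load set at 34F. I'll text the driver to confirm the current reading."},
    ]},
]

with open("finetune.jsonl", "w") as f:
    for row in examples:
        f.write(json.dumps(row) + "\n")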

Blythe Brumleve: 35:17

Yeah, it's almost like, and I feel like I'm saying, you know, drink every time I say that phrase, but it's going to happen a lot in this conversation. For a lot of marketers, I think it's tough to get used to this industry, the slang, the terminology. But it kind of sounds like you've already programmed a lot of that slang and terminology into the machine, so now it's already up to speed on it and it can speak the lingo that everybody else is speaking.

Garrett Allen: 35:51

Yeah, yeah, that's a big part of it. With communications like text especially, people can be really short and use a lot of slang, and so it's definitely important for LoadPartner to understand that and know what people are talking about, basically.

Blythe Brumleve: 36:05

You're one of the first companies that I've seen deploy AI solutions in the freight world. Do you see any other companies that are doing AI right in this space? Are they existing companies or are they new companies, or maybe a combination of the two?

Garrett Allen: 36:23

Yeah, I mean, it's hard to say what is right. It is like the new frontier, like I said. Everybody's trying to figure it out right now, I think, which is cool.

Garrett Allen: 36:33

I appreciate all the different approaches we're seeing. I would say, the way we're approaching AI specifically, I have really not seen other companies doing it the way that we are, but there are definitely a lot of people out there. I know Parade does a lot of stuff around email parsing, for example, and I think they're using generative AI in a way that's, hey, this is great, it's bringing users value. They're not trying to give them the world, which I think is really important. When you're building with AI, you've got to be intentional about it, not just be like, hey, this is AI, we'll drop it in and the problem is solved.

Garrett Allen: 37:10

That's one of the biggest things when looking at how people are using AI: what are they actually trying to solve? Do they even need AI? Even within LoadPartner, we have a lot of, like I said, freight tooling we've built that's not AI. Sending a text message, sending a location link, that stuff's not AI; they're tools the AI uses. But it's us being intentional about the problem we're trying to solve. So, yeah, looking at other companies, I'd say it's hard to say who's really doing it right. I think there are a couple out there doing interesting things with it, though, for sure, and I definitely like watching all the different attempts at plugging AI into existing processes.

Blythe Brumleve: 37:49

Now, I love that you brought up Parade, because I actually had their example in my show notes. I saw a post from Parade, because they have a new product called Capacity CoDriver, and it's inside the McLeod TMS. I'm quoting their CEO here, but he says it marks a pivotal moment in freight brokerage, as 50 loads per day per person will quickly become the new normal. Now, when I worked at a freight brokerage, if you did 50 loads in a day, you were the rock star of the month. It sounds crazy that that would be the new norm, 50 loads per day per person. But if you're automating a lot of these different tasks, then it makes sense that that person is going to be freed up to do these other things. Is that a safe assumption?

Garrett Allen: 38:35

Yeah, definitely, yeah. And like I said earlier, I think that's a great example of, if you can automate stuff that's very transactional, we get this and we want to give this back to them, and there's not a plus-one service to offer here, it's just get this done, then that's a great example of, hey, now we can do 50 of those. Yeah.

Blythe Brumleve: 38:50

Well, as you were just talking, it brought up this other point that I heard someone make. They were saying, theoretically, if AI is going to free up our time from these mundane tasks, maybe those mundane tasks are important, because they give us the bandwidth to handle the larger, I guess, creative tasks better. If we have to be creative all the time, that would lead to quicker burnout, is kind of the argument this creator was making: I can't think like that all the time, sometimes the mundane tasks help me to step away from the big thinking, and then I can get back to it later. Do you think there's an argument to be made that it might lead to burnout much quicker if we're just focusing on these bigger tasks all the time?

Garrett Allen: 39:52

Maybe. It's definitely an interesting thought. I think what we'll end up seeing, if I had to guess, is that a lot of those mundane tasks will get automated through AI, but some of the more complicated tasks, like these creative ones that are, hey, I've got to spend a day really thinking about how I'm going to do this, become more mundane because we have better tools for them.

Garrett Allen: 40:13

So even though, hey, AI can't solve this all the way, it could still maybe solve it 80%, and you're just doing that last 20%. So just because AI can't do something doesn't mean that it's necessarily going to be a really hard task. It might be that just that final bit needs to get finessed by a person actually looking at the response or filling out the final details. So I think there's still going to be mundane work to do, even with AI around, and that goalpost will just continue to move. The tools will get better, we'll solve harder problems, the harder problems will get easier, and the cycle will repeat, I think.

Blythe Brumleve: 40:50

If someone is listening to this interview and they're thinking, oh, LoadPartner sounds like a good solution for me, how should they start preparing their company, their business, maybe even just the mindset within the company? How should they prepare themselves internally to be able to start hiring external sources like yourself?

Garrett Allen: 41:13

Yeah. So, you know, a good API is definitely a big one, understanding how you get data in and out of your system. Again, you don't necessarily need the whole world of your 10 years of history, but just understanding how an integration like that would work. The other thing is, LoadPartner is really, like I explained earlier, taking your preferences on how you like to run freight and then trying to mimic that within LoadPartner and run it the same way. So understanding what that looks like for your company is also important.

Garrett Allen: 41:46

If you're like, we don't really know when we do check calls or why, then we're going to have a hard time replicating that.

Garrett Allen: 41:55

And then, yeah, another thing I kind of touched on was just, you know, being intentional about the problem you're trying to solve.

Garrett Allen: 41:59

If you're going to bring AI in, LoadPartner or otherwise, really think about what problem you're solving.

Garrett Allen: 42:06

If you're just pulling in AI to pull in AI, you're probably not going to get a lot of value out of it.

Garrett Allen: 42:11

It's really good to outline what you're trying to solve specifically. You know, hey, we need to scale up our load count per rep and we don't have enough hands, so we need to bring in AI to assist this person who's going to have a lot of loads this month; okay, we can pull LoadPartner in. But if you're just like, hey, it would be fun to have a chatbot that can answer questions about our knowledge base, it's like, well, do you really have an issue there that you're going to solve with a chatbot, right? So it's really important to think about the specific problems you're trying to solve with AI if you're going to bring it in. Because there are scenarios, again, back to LoadPartner, where some of our stuff, our location links and sending texts, is traditional programming. That's not AI, and there's potential that even non-AI solutions will solve the problem that you have.

Blythe Brumleve: 43:01

So it's almost like a good process audit is in order for a lot of companies, in order to figure that out. Maybe that's a good starting place for them: just audit your processes. Do you even have them documented? If not, document them, and then see where AI plays a role.

Garrett Allen: 43:18

Yeah, yeah, I mean, really, the way I see it is, AI is going to be able to fill in like a new hire would. If you're going to onboard a new person onto your team, how would you explain to them how you want to run a load? If you can't do that, then you also can't explain it to a system that is going to try to replicate it, right. Same thing with a lot of AI. Most of the solutions need similar guidance, and so if you can't explain what you're trying to do, the AI can't magically make that decision for you.

Blythe Brumleve: 43:48

What does the pricing model look like for LoadPartner? Is it per user, per, I guess, token, or maybe a combination of the two?

Garrett Allen: 43:56

Yeah, it's a good question. We're still working with our early design partners to finalize that, but mostly the way we look at it is per load. I think that's how a lot of freight brokerages think about their costs, really cost per load. We have a couple of different features that may be priced differently based on the needs you have. Like I mentioned, we have AI voice calls; if you don't need voice calls, that changes the pricing for us too. So there's a lot to work out there, basically, but cost per load is definitely how we're approaching it.

Blythe Brumleve: 44:29

Interesting. And you're still on a waitlist, correct? You talked about your design partners, and those are the companies that you're working with, I guess, maybe to come out of beta later this year? Or what does that time frame look like, if you have that in mind?

Garrett Allen: 44:42

Yeah, yeah. So we still have a waitlist right now. We're definitely going to be out of beta and open this year, for sure. I would expect sometime in Q2 we would have a sign-up available, or, if you're interested, you can sign up and just use it on your own. Like I mentioned, you can use it outside of your system. What we're really going for is that somebody can go sign up for LoadPartner and literally just start using it without even having to go through a sales cycle or anything. Just go sign up, put your information in, get a subscription, and then start using it. We have inbound and outbound APIs that are completely self-serve if you want. So we're shooting for that later this year; I would say sometime in Q2 the gates will open a bit wider.

Blythe Brumleve: 45:22

Oh, that's cool. Yeah, that's exciting to watch, and I love that model of just letting people try it and give it a test drive before having to talk to anyone. I know that a lot of folks, especially the ones that are going to adopt, or are more willing to adopt, this kind of technology, maybe don't want to hop on a sales call. They'd rather just give it a test drive themselves, and I'm sure you're seeing a lot of that same thing.

Garrett Allen: 45:43

Yeah, absolutely. We have a lot of confidence in the stuff we're building, so we definitely want to just get it out there and have people experience that magical moment of, wow, it just saw that the driver's late and sent me a text, and see that in action. We definitely want to give that experience to people so they can understand.

Blythe Brumleve: 46:02

And what does your team structure look like? So, your co-founders, I imagine there's another founder that you're working with. Do you have a team, maybe offshore or internal? What does that look like for you?

Garrett Allen: 46:13

Yeah, so right now we have a pretty small team. It's me and my co-founder working on this, and then we have someone helping with marketing and someone helping with customer support, basically helping when on-call kinds of things come up.

Blythe Brumleve: 46:24

Wait. So how are you handling your customer service and marketing? With AI? I'm curious.

Garrett Allen: 46:31

So with customer service, you know, even with AI, stuff comes up. So we still have customer service to help if somebody has questions about how to use the system or something's not working correctly. And then with marketing, yeah, we actually have someone. You might know Little Miami Marketing, Chase Osborne, if you know him. Yeah, he's been helping us, and he's doing a fantastic job. Oh, awesome. So all of our posts on Twitter and our funny memes, that's all him.

Blythe Brumleve: 47:03

Oh, that's great, because it is great marketing, and that was one of the big reasons why I wanted to have you on. Because it's called Load Partner, and we kind of joked around before we hit record that you kind of capitalized on the cowboy moment, because of Beyoncé's new country song and all that. So it's really cute branding, and I love how he really dives into and embraces the meme culture that has taken over freight.

Garrett Allen: 47:29

Yeah, yeah, it's so good. We've all enjoyed it. It's been a kick for us too.

Blythe Brumleve: 47:35

That's awesome. Shout out to Chase, good job, because the marketing has definitely been noticeable. So hopefully that's a good compliment to him; it definitely is, at least from my side of things. Now let's switch gears a little bit and talk about the future, because I want to know what is all hype in AI and what do you think could be around for the long haul. Are we still going to be using LLMs in a few years, or is that really difficult to tell?

Garrett Allen: 48:03

That's a good question. You know, I don't know. We'll be using some form of something like an LLM, probably. I think there could be levels on top of that, kind of what I was saying, where we have multiple AIs working collaboratively to give you the one AI experience. I think some kind of architecture around that might become more standard, so it's not just one LLM, it's many, somehow.

Garrett Allen: 48:28

I do think that, like I mentioned earlier, the generation speeds with companies like Groq are going to really make a difference, because right now, when you ask ChatGPT something, you have to sit there and watch it type its response out. It's pretty slow. With Groq, we're crossing into the point where an AI can think faster than a person can think, and that just opens up a lot of opportunities for real-time applications like robotics, or even just better chat experiences. When you can ask an AI a question and it can go search five websites, do all this research, and give you a response back faster than you can even think about what you asked it, that's going to be crazy, right? That's a whole other world. So I think, as we get closer and closer to that, the AI is just going to become more and more mind-blowing. I think we basically haven't even seen the beginnings of how crazy the technology can really get.

Blythe Brumleve: 49:22

And when you're talking about Grok, you're not talking about, like, Twitter's Grok. You're talking about the other Grok, is that right?

Garrett Allen: 49:31

Yeah, I should have clarified, that's a good point. So there's a company called Groq, spelled with a Q: G-R-O-Q. Essentially, the short version is that they manufacture their own chips that are specifically designed for this. So if you're familiar with a computer, you've got your processor and then you have a graphics card, right, and they're specialized: the graphics card is, obviously, for your graphics.

Garrett Allen: 49:59

Well, Groq has created language processing units, which are basically built to run LLMs, so essentially they can generate responses from LLMs something like 10 times faster than the competition. I think the numbers are, and don't quote me on this, but I think ChatGPT is around 40 tokens per second, and a token is roughly three letters. And then Groq, running Mixtral, or maybe it was Gemma, just hit something like 400 or 500 tokens a second. So with ChatGPT you can ask it to write you a story and you see it writing out the paragraphs. With Groq, it's like when a web page loads.
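To put those rough numbers in perspective, here is a quick back-of-the-envelope comparison. The speeds are the approximate figures mentioned above, not exact benchmarks, and the 700-token story length is just an illustrative guess.

```python
# Back-of-the-envelope comparison using the approximate figures mentioned above.
story_tokens = 700          # a short story of roughly 500 words
slow_speed = 40             # tokens per second (approximate ChatGPT figure)
fast_speed = 450            # tokens per second (approximate Groq figure)

print(f"Slow: ~{story_tokens / slow_speed:.0f} seconds")   # ~18 seconds
print(f"Fast: ~{story_tokens / fast_speed:.1f} seconds")   # ~1.6 seconds
```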

Garrett Allen: 50:42

It's just all there at once, the whole thing. Which is crazy, because then if you build something like a ReAct model, where you have an AI that's actually reasoning and acting and looping on itself, you can ask it to do something and, just like that, boom, it can have done 10 things instantly.
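For readers unfamiliar with the pattern, here is a minimal sketch of a ReAct-style loop, assuming a hypothetical call_llm helper and a single placeholder search tool. It is only meant to show the reason-act-observe cycle, not any particular framework.

```python
# Minimal sketch of a ReAct-style loop: the model reasons, picks an action,
# sees the result, and loops until it decides it is finished.
# call_llm() and search() are hypothetical placeholders, not a real API.

def call_llm(prompt: str) -> str:
    """Placeholder for a call to a fast LLM endpoint; returns a canned reply."""
    return "FINISH: (placeholder answer)"

def search(query: str) -> str:
    """Placeholder tool: imagine a real web or database search here."""
    return f"(results for {query!r})"

def react_agent(task: str, max_steps: int = 10) -> str:
    history = f"Task: {task}\n"
    for _ in range(max_steps):
        reply = call_llm(
            history + "Reply with 'ACT: search <query>' or 'FINISH: <answer>'."
        )
        if reply.startswith("FINISH:"):
            return reply[len("FINISH:"):].strip()
        # Run the chosen action and feed the observation back into the loop.
        query = reply[len("ACT: search"):].strip()
        history += f"{reply}\nObservation: {search(query)}\n"
    return "Stopped after max_steps without finishing."

print(react_agent("Find three carriers for a reefer load"))
```

The faster each call_llm round trip gets, the more of these loop iterations can finish before a person would even have typed a follow-up question.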

Blythe Brumleve: 51:03

So why isn't, you know, something like ChatGPT using the chips specifically designed for LLMs?

Garrett Allen: 51:12

I think they're just new. I know previously they were using chips from Nvidia, which is why Nvidia's stock is crazy right now; that's all the AI stuff. I have a feeling they probably have something up their sleeve. They haven't mentioned a partnership with Groq yet. I know Groq has been prodding at them on social media, which has been fun, but yeah, I don't know. I'm sure they have something they're working on, because Groq has been making so many waves and it's really cool, so I'm sure something's coming. I think it's just a matter of, again, this being such a new frontier with so much that's new here; they just haven't done it yet, basically. It's just brand new.

Blythe Brumleve: 51:51

So we've covered a lot of ground so far. It's been on the chip side of things, it's been on the large language model side of things, but then there's also the open versus closed side of things. Do you see enterprises building their own Groqs internally, or their own ChatGPTs internally? Or will they use some of these tools in the best use case, like what we were talking about earlier, where maybe one is better for images and one is better for documents? Is that the kind of, I guess, hybrid future that you see?

Garrett Allen: 52:26

I think it's kind of hard to tell. I do think we're going to see a lot of AI providers pop up that are specialized in industries. You know, like Load Partner, for example, where we're sort of this AI provider that specializes in running freight on the road. I think there are going to be a lot more of those, just in different industries, and companies are going to kind of move toward, hey, let's just find a partner that really is keeping up with AI, because it's really fast-moving, and pull in their systems rather than build it ourselves. But I definitely think there will be companies that run their own. I think it'll be a combination of using general-purpose models like ChatGPT to solve a simple use case, and then more advanced setups where companies are actually trying to build their own internally and train a model from scratch for some specific use case.

Garrett Allen: 53:13

I think we'll see a little bit of everything, but I think the most popular will be, you know, finding a good AI partner for your business.

Blythe Brumleve: 53:19

That will be like the thing that makes the most sense. An internal AI ops team is kind of what I've suggested to other companies that have come to me and asked, well, how do we even start implementing this? And I was like, well, just start internally: either assign one person as an AI ops manager to figure out the different use cases and stay on top of the news, or maybe it's an internal team, you know, subject matter experts from each department that are getting together regularly and convening on how they can start utilizing these tools. Do you think that's a good pathway for businesses to use as they think about AI adoption?

Garrett Allen: 53:56

Yeah, definitely. Especially at a medium or larger business, it makes a lot of sense to have an internal team that's, you know, tasked with this. Speaking strictly from the engineering side, we have a lot of similar things, right, like a data analytics team or a DevOps team that does deployments to production. Having something for AI makes sense. You know, it's a really fast-moving field; it's hard to keep up with.

Blythe Brumleve: 54:22

And there was a quote from the OpenAI founder, or one of the founders, Sam Altman, where he said about 95% of marketing tasks are going to be handled by AI, which, as a marketer in freight, kind of freaks me out. But I try to avoid sort of the doomerism around it, because I think to my own examples: social media managers, for example, didn't exist 15 years ago. That was a job that evolved from tech innovation and tech evolution. So I guess, how are you combating maybe the doomerism that exists when you're talking to customers, maybe fear of being replaced within your own organization, or maybe you're building a business that might be obsolete in five years? I think those are questions that a lot of folks who are studying this are having. So how are you, I guess, sort of coming to grips with a lot of those questions, or how are you thinking about it?

Garrett Allen: 55:16

Yeah, no, it's a good question. I think, kind of like what I was saying earlier, even with marketing, right, and that 95%: I don't know anything about where those numbers come from, but just thinking about it, you could go on ChatGPT today and ask it to write you an article and generate an image for the thumbnail, and if you posted that article and that thumbnail, people would immediately know you generated it with AI. It's not genuine. It's kind of like the same thing I've read eight times with different things plugged in. So I think that same thing applies everywhere.

Garrett Allen: 55:47

You know, if there is a service that's really transactional, where it's like, hey, we get A, we give B, there's nothing we can do better here, they just want this done and they don't care, then yeah, AI is great.

Garrett Allen: 56:00

If you're like, hey, we're trying to make a sale, trying to land a customer, trying to give a really excellent experience, just plugging in AI is probably not the answer. Maybe it can help, but just plugging it in isn't really the answer. You want to use it to give a better experience, not to replace the experience. So I think when you're looking at AI, again, it's just being intentional about the problem you're trying to solve, basically, and not just trying to make it easier for yourself, because people can tell. And the thing with people is they've always appreciated the genuine person, the real original content, stuff like that; you can always tell. Even going on YouTube, you've got how many channels that are the same exact channel with a different person doing the same things, and they all get lost because it's the same thing. The ones that are popular are the ones doing it differently, and I think that's always going to be true.

Blythe Brumleve: 56:52

Yeah, I would agree with that. And another, I guess, sort of use case that I saw that blew my mind, and I don't think this is exactly true yet, but it involves the conversation around AGI, that more expanded version of the large language models we're currently using. It was basically a lot of pixel characters sitting in a pixelated office, and it looked like a little video game. You had different departments: you had a marketing department, you had a sales department, and all these little characters were working at their individual desks, talking to each other independently and solving problems. Do you think that's a real thing? Would I be able to build my own sort of mini marketing agency where one team is handling SEO and one team is handling web design? Is that a near future, I guess?

Garrett Allen: 57:48

It's definitely a future. Is it a near future? I think it's hard to tell. The tech is moving so fast. Possibly.

Garrett Allen: 57:55

I think you're still going to run into a lot of the same issues where, if you've got just an AI handling SEO, for example, it might not do the best job. It's going to do the bare-minimum standard. If you really want to crush it, you probably have to get a person in there. But I do think a more near-term future is, like I mentioned again with Load Partner, having the different AIs working in collaboration. It's really approachable to have an AI that specializes in SEO, that just does SEO. I think a more near-term future would be a company that has AI sort of broken out, similar to whatever departments it has in place.

Garrett Allen: 58:32

Or, you know, like how you run your marketing agency, where you have, okay, I'm going to be doing SEO stuff today, so I'm going to go work with my AI that focuses on that specifically to get feedback on it. And a company might have an AI that's trained for helping with different parts of marketing, and that's what those employees use to help them out. But I think having a full-blown company that's entirely AI-run is probably not quite as near a future, just because there are too many edge cases, too many things to think about. You could probably technically do it; it would just be pretty poor.

Blythe Brumleve: 59:09

Basically, yeah, that makes sense. That's probably what I was seeing in those images, but what is the actual work output, and is it decent enough that somebody would actually pay for it? My guess right now would be no, it would not be something that people would pay for. Since we were talking about Google search, I'm thinking about Google and the sort of big snafu they had a couple weeks ago with their launch of Gemini, and, I guess, the PR backlash over its programming. Where do you see Google, or Alphabet as their parent company, fitting into this AI arms race? Do you see them being competitive, or was that snafu they had a few weeks back indicative of, I guess, a larger data set problem?

Garrett Allen: 1:00:00

Yeah, I don't know. To be honest, I've actually been really surprised with how far behind I feel like they've been since sort of day one. You know, Google has always been sort of the AI-forward company, especially on the software side. I was really shocked at how poor their first reveals were and how they've just really struggled the whole time here. Honestly, I think they are just legitimately behind at this point and trying to catch up, and everything is moving so fast that when you're behind, it's just so difficult to catch all the way back up.

Garrett Allen: 1:00:35

So, yeah, I don't know. I think one of the things they probably do have an edge on is what they're talking about with some of these smaller models: getting them onto their phones. Kind of like Groq's chipsets that are made for LLMs, but something similar for phones, where they've got these AI processors in the phones that help run these small models directly on the device, quickly. They probably have an edge there. So maybe not so much in the great general-purpose everything models, or even your enterprise stuff, but more the small, fast, can-answer-a-quick-question, Google Assistant kind of stuff. I think that's where they might have something sooner rather than later, probably.

Blythe Brumleve: 1:01:17

Which is crazy, because they have access to so much data. So it boggles my mind how they could be so far behind on this. It just feels like they kind of got caught with their pants down, essentially, and just caught surprised. Or maybe they just don't want to mess up their entire business model, which all comes from search. I think it's something like 80 percent of their revenue comes from search, which is crazy. So they probably don't want to upend that, but then they're going to get replaced by it anyway. So it's really weird to watch them struggle over these last few months. Do you think that they recover?

Garrett Allen: 1:01:51

Yeah, I think they will, just because I think the Android devices give them such an edge on getting an AI of any kind into someone's hands. ChatGPT and Claude 3, benchmark-wise, they look great. Enterprise-wise, they look great.

Garrett Allen: 1:02:14

But nobody's really done a Google Assistant, a Siri, an Alexa yet that's really generative AI. Nobody's quite done it. And Google has an experimental one out that I've messed with a little bit on my phone, where it can actually replace your Google Assistant with Gemini. So it's actually generative AI, but it doesn't have all the tooling hooked up, so it can answer questions like ChatGPT.

Garrett Allen: 1:02:39

But if I say, what's on my calendar today, it can't tell you yet, though that's supposedly coming soon. So if they can bridge that gap before anybody else, I think that will get them kind of caught back up, in a sense, even though it's sort of a different area. It'll just be, okay, Google's doing something with AI; they were the first to the personal AI assistant, sort of. So that's what I would expect. I know there are some rumors about Apple doing AI stuff behind the scenes and different stuff like that, so I think there's going to be a lot of interesting stuff on that personal-assistant AI front within the next two years, probably.

Blythe Brumleve: 1:03:12

How do you keep up to date with all of the news that drops, and experimenting and trying to focus on building a business, but also trying to stay up to date with all of these different tools. How are you managing it all?

Garrett Allen: 1:03:25

It's hard, you know. Honestly, sometimes it's a lot. I think the big thing for me is I follow a lot of the companies on LinkedIn and Twitter to see when they make announcements. Hacker News is always a go-to for me to see the new tech news for the day. And then otherwise, yeah, there's stuff like Groq: when it first came out with the fast token stuff, I just saw somebody sharing it on Twitter by chance one day and I was like, oh, what's that? And then the next day or whatever, they announced a really fast run with the Gemma model from Google, and I was like, oh my gosh, this is crazy.

Garrett Allen: 1:03:58

Yeah, I guess it's just kind of being in the news feeds a bit and trying to keep up with the AI tags, and trying to weed things out. There are some people who, you can tell, post regular stuff that's actually news, versus some people who will post stuff and it's like, this was from a week or two ago, you know?

Blythe Brumleve: 1:04:17

Like the AI newsletter roundups. I feel like I'm subscribed to half a dozen of them right now and they're all reporting on the same thing, so it's probably time to go and clean up my subscriptions a little bit. I am curious, a couple of last questions here: what tools, I guess, in the AI realm? Are there any AI tools that you're using outside of Load Partner?

Garrett Allen: 1:04:40

You know, honestly, funny enough, I've actually done some transcription stuff before, last year a little bit, when Reid was doing Please Advise. With Reid and all of them, there's a tech Tuesday thing they do on the Discord, and last year, for a month or so, we were doing summaries of those nights, and it was really cool. We were basically using Amazon to transcribe it, and then we were feeding the whole thing into Anthropic's Claude to get a bullet-pointed summary. And I was really impressed by that, because we would have three- or four-hour conversations and we would just plug it in and it would give us, here are the 10 topics you covered and what people said, which was awesome. And then, yeah, I use GitHub Copilot for programming. It's funny, because I would have probably said there's no way I would ever be that guy that uses that, but oh my gosh, it's life-changing, it's so easy.
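A minimal sketch of that kind of summarization step, assuming the recording has already been transcribed (for example with Amazon Transcribe) and saved as plain text. The file name and prompt wording are illustrative, and the script expects an Anthropic API key in the environment.

```python
# Minimal sketch: turn a long transcript into a bullet-point summary with Claude.
# Assumes the transcript already exists as plain text; file name, model choice,
# and prompt wording are illustrative only.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

with open("discord_call_transcript.txt") as f:
    transcript = f.read()

response = client.messages.create(
    model="claude-3-haiku-20240307",  # any current Claude model would work here
    max_tokens=1024,
    messages=[{
        "role": "user",
        "content": "Summarize this conversation as about 10 bullet points, "
                   "noting the main topics and what people said:\n\n" + transcript,
    }],
)

print(response.content[0].text)
```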

Blythe Brumleve: 1:05:41

How so? What does it do, for folks who may not know?

Garrett Allen: 1:05:43

Yeah, so GitHub Copilot is basically an LLM that helps you write code. If you have a file open, you can literally add a comment that says, I want a function that does this, and it'll spit out a function that does that. It's not perfect, but it does a really good job, especially when it's, I need to write a function that I know I've written six times and I just don't want to type it out. You basically just write the comment and let it do it.
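In practice the pattern looks something like this: the comment on the first line is what you type, and the function below is the kind of suggestion Copilot might offer. This is an illustration of the pattern, not literal Copilot output.

```python
# write a function that converts pounds to kilograms, rounded to two decimals
def pounds_to_kg(pounds: float) -> float:
    return round(pounds * 0.45359237, 2)
```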

Garrett Allen: 1:06:11

And it's funny, one of the things I tell my co-founder is that in programming there's a lot of back and forth about how much you should leave comments and what does or doesn't need a comment. Well, five years from now, I think it's going to be funny when we go back and look at this old code that people like myself have written, and you're going to see comments that are actually prompts for AIs. It'll be a comment somebody left behind that says, write me a function that does blah, blah, blah, and then there's the function. Oh, he was probably using an early version of Copilot when he wrote this; that's clearly an AI prompt. But yeah, it's funny. I love it, though, so you know, it's helpful.

Blythe Brumleve: 1:06:45

No, it definitely sounds super cool, especially with your expertise and your history within the freight industry. I think you're coming from the perspective of having already built it manually, so now, if you were to build, I guess, sort of a modern-day TMS, what would that look like? And it sounds a lot like what Load Partner is: a perfect complement to the day-to-day life of the brokers and carriers out there. Is there anything that you feel is important to mention that we haven't already talked about?

Garrett Allen: 1:07:16

Oh no, I think, you know, the biggest thing with AI is just to be intentional about how you want to use it and, I think, to find a good partner. That's just really important as far as staying ahead of sort of the AI hype train. It can be really easy to get sucked up into it and be like, we've got to plug AI in everywhere. I think it's just, think about what you're trying to solve and find a good partner that knows what they're doing to help you solve that problem, with AI or not.

Blythe Brumleve: 1:07:41

Yeah, well said. Where can folks follow you, follow more of your work, sign up for Load Partner, maybe?

Garrett Allen: 1:07:46

Yeah, so you can find us at loadpartner.io, just one word. You can email me, always happy to talk about this stuff, at garrett@loadpartner.io. And then you can also find me on LinkedIn, and on Twitter at garrett_makes.

Blythe Brumleve: 1:08:00

Awesome. We will put all of that in the show notes just to make it easy for folks. But, Garrett, this was an awesome discussion. Thank you for entertaining my super basic, intro-level questions, but hopefully this was a good educational lesson for the folks who have been curious about AI and freight and maybe want to learn a little bit more and be prepared for the future. So thank you for your expertise and your time.

Garrett Allen: 1:08:21

Thanks for having me on. It was fun.

Blythe Brumleve: 1:08:23

Absolutely. I hope you enjoyed this episode of Everything is Logistics, a podcast for the thinkers in freight, telling the stories behind how your favorite stuff and people get from point A to B. Subscribe to the show, sign up for our newsletter and follow our socials over at everythingislogistics.com. And in addition to the podcast, I also wanted to let y'all know about another company I operate, and that's Digital Dispatch, where we help you build a better website.

Blythe Brumleve: 1:08:53

Now, a lot of the time, we hand this task of building a new website or refreshing a current one off to a co-worker's child, a neighbor down the street or a stranger around the world, where you probably spend more time explaining the freight industry than it takes to actually build the dang website.

Blythe Brumleve: 1:09:09

Well, that doesn't happen at Digital Dispatch. We've been building online since 2009, but we're also early adopters of AI, automation and other website tactics that help your company be a central place to pull in all of your social media posts, recruit new employees and give potential customers a glimpse into how you operate your business. Our new website builds start as low as $1,500, along with ongoing website management, maintenance and updates starting at $90 a month, plus some bonus freight marketing and sales content similar to what you hear on the podcast. You can watch a quick explainer video over on digitaldispatch.io; just check out the pricing page once you arrive, and you can see how we can build your digital ecosystem on a strong foundation. Until then, I hope you enjoyed this episode. I'll see you all real soon, and go Jags.

About the Author

Blythe Brumleve
Creative entrepreneur in freight. Founder of Digital Dispatch and host of Everything is Logistics. Co-Founder at Jax Podcasters Unite. Board member of Transportation Marketing and Sales Association. Freightwaves on-air personality. Annoying Jaguars fan.

To read more about Blythe, check out her full bio here.