26: AI-Generated Episode

26: Dev Chat "AI-Generated Episode" Season 3

Cory, Phil, and Jack talk about the recent explosion of machine learning algorithms, how they will affect what we do as developers and designers, and how we do it—if we still have jobs at all.



Cory (00:00:15):
Welcome back to In The Loop, a WordPress Agency podcast by Blackbird Digital. I'm Cory Hughart, and in this episode, I'm joined by Phil Hoyt and Jack Watson to talk about the recent explosion of machine learning algorithms, how they will affect what we do as developers and designers, and how we do it if we still have jobs at all. If you have questions about WordPress website development, contributing, or anything else web-related that you'd like to hear us discuss, send an email to [email protected]. You can also find us on Twitter, Instagram, and TikTok as InTheLoop_WP. Blackbird Digital is a web and app development agency that specializes in WordPress, creating onscreen experiences that connect, teach, communicate, and inspire. Visit for more information. Enjoy the show.

Cory (00:01:08):
Welcome back for another Blackbird Dev Chat—with a twist. Due to the topic under discussion today, Phil and I have invited Blackbird's UI/UX designer Jack Watson to join us. You may recognize her from episode 13 with Bet Hannon, where we discussed web accessibility. She has an extensive illustration and animation background, and she's a featured live streamer on Behance, as well as an Adobe mentor and Adobe live host. And full disclosure, we're also married. Welcome to Dev Chat, Jack.

Jack (00:01:51):
Hey, thanks for having me back on again. Excited to... apprehensively excited? <Laugh> To talk about the topic today.

Cory (00:01:59):
You came back for a great one.

Phil (00:02:01):
This is, yeah. There's no controversy in this one at all. It's just gonna be a straight shot, easy-to-talk-about podcast.

Cory (00:02:09):
So it's taken us a long time to get comfortable with the idea of doing an episode about the current state of generative AI—text, images, code—because it is fraught with valid ethical concerns about how the data for these AI models are sourced and processed. We will be setting that debate aside for today to focus on the technology and the potentially positive and negative impacts it will have and maybe is currently having on our work as creative agency developers and designers. So, with that said, AI is here.

Phil (00:02:54):
AI is here.

Cory (00:02:56):
It's coming for us.

Phil (00:02:58):
<Laugh> Speaking of "AI is here": while you were going through that intro, I was wondering, when did this current version of AI that we're all kind of meshing with show up? 'Cause, you know, the word AI's been thrown around for the last several, several decades <laugh> at this point. But this current version, when did this idea start bubbling up and start entering your... Like—

Cory (00:03:20):
Started making me anxious about the future, you mean?

Phil (00:03:24):
Sure. Whatever, I mean, I did not say anything about your feelings, <laugh>, but, you know, if that's where you wanna start. Because, I mean, for me, I remember it was the DALL-E stuff. Yeah. It was like these really, like—

Cory (00:03:36):
Postage stamp—

Phil (00:03:36):
Cursed images, yeah. Of, like: ooh, I typed in some words, and some of the most disturbing images came out of it in those first few months. And, you know, people started just going to that, you know, DALL-E light website, and you could pump out a handful of little squared images.

Cory (00:03:54):
I did a couple of those just for, you know, the novelty factor of, like: let's see what it does with two completely opposing concepts, and what kind of, you know, horrors of the imagination it produces <laugh>. Right.

Cory (00:04:13):
When was that? That was what, like—

Phil (00:04:15):
It feels like a while ago at this point. I would love to nail down the exact date, but it feels like maybe a year and a half, two years ago at this point, when it started. Like, you know, you'd be on Twitter and you'd just see the three-by-three grid of, like: look at what I created, I put in Pikachu plus old-timey hat, and I got the creepiest thing in the world. <Laugh>.

Cory (00:04:40):
Well, since then, things have rapidly developed <laugh>, for sure, in only a year or two. Some examples being, you know, not just DALL-E; image generation in general has exploded with the Stable Diffusion type generators and Midjourney and those sorts of things. ChatGPT has entered the chat, so to speak, <laugh> with the large language model stuff. And then of course things like GitHub Copilot for code autocomplete generation kind of stuff, and I know there's more going on with that in the future. I haven't looked into these much, to be honest, but there's also, apparently, a completely free Copilot-style thing from Amazon.

Phil (00:05:36):
Oh, I had no idea.

Cory (00:05:38):
So all those things are happening.

Phil (00:05:41):
Sure. I think those are like the big hitters that, you know, I think when you hear AI, those are the ones people think of right now.

Cory (00:05:48):
And then in the world of image generation, let's just say, you know, Adobe... this is where the Adobe stuff from Jack's intro is relevant <laugh>. Adobe's of course working on this. I mean, they've been working with machine learning stuff for a long time, so they're sort of entering the race with Adobe Firefly, but we'll get back to that in a second. I wanna set the stage first, in terms of the text generation stuff. So, you know, we saw, not too long ago, maybe six months ago or so, certain scientific journals, and short story competitions and things like that, just getting overrun with obviously generated submissions.

Phil (00:06:38):
Sure. Yeah. Same with some art submission type stuff, also, you know, photography type contests; I think one winner was like an automated submission.

Cory (00:06:47):
Well, yeah, there was that controversy a while ago about, you know, AI-generated art that won an art competition.

Phil (00:06:56):
But I guess, in general, the main subject we're talking about is that these previously user-generated submission type forums are being flooded with AI content.

Cory (00:07:07):
You know, in terms of code, Stack Overflow banned, you know, generated responses, answers. And art portfolio websites, of course <laugh>, took one of two paths, right? One being to just let it all in, and those have essentially become, as far as I can tell, basically AI art showcases; they completely transformed from what they used to be. Right. Or they banned AI art submissions, and, you know, I don't know how that's working out for any of them, necessarily, but that's where we're at. Or at least that's where we were at a couple months ago. I'm trying to keep up with these things, but I know today, or maybe yesterday, whatever, people are losing their minds over some generated video where there's, like, a bunch of people dancing, and, I don't know, there's, like, anthropomorphic cats,

Phil (00:08:15):
<Laugh>, I saw a few of these. You know, I am on the TikTok, as an old person would say. And yeah, there was definitely a flurry of people getting their hands on this newer video AI type situation. And yeah, they were typing in things like "pandas at a salon", and they would get, again, we're back at that DALL-E stage of things, where it was these cursed images of pandas cutting other pandas' hair. And it is pretty disturbing. But you know, it's that case where it's like: wow, this is novel at the moment. But I bet in the next few months, unfortunately, it'll probably be like: okay, well, yeah, we can type "pandas cutting each other's hair", and we're getting pretty close to what we're looking for. So as much as it's a joke right now, and maybe scary, it'll probably come to fruition in the very near future.

Phil (00:09:08):
So yeah. At the rate that everything is moving, I'll be honest, I don't put a lot of stock in a lot of these like, weird negative things. Like I see a lot of it, but nothing in my life has been impacted greatly by this you know, I'm not on any of these communities. I'm not on YouTube seeing a ton of generated content that's like overflowing my feed on the socials or anything. So it's hard for me to weigh in on these. I'm sure it is pretty disturbing for some people. I think where I have seen it, and at least maybe where we can kind of pigeonhole this conversation for our podcast is like starting to see it inside of the WordPress sphere. You know, we're just now starting to see plugins and blocks kind of integrate with AI.

Phil (00:09:53):
If you're familiar with the Notion writing application, it's very Notion-esque, where you can kind of ask it to write you some... you know, you prompt it, like you do with lots of these text generators, and it outputs, you know, text. I think that is one version that we're starting to see in WordPress. And then we also have this other, like, weird other side, which is a little more hands-on, but it's like: Hey, ChatGPT, write me a plugin that does X. And we're starting to see that. I kind of do feel like those are just, you know, a trend for people who are trying to stay relevant inside the YouTube and Twitter spheres.

Cory (00:10:32):
Yeah, there was that moment, like, a couple months ago. I mean, we're still on the tail end of it, I think, but early 2023, I will say, where, you know, every, let's say, personality that I follow in the WordPress sphere was: "I asked ChatGPT to write a WordPress plugin, and this is what happened," you know, <laugh>. Yeah. And, you know, whether they were doing that just for the content, to say one way or another: this is great, or this is bad. Or, you know, I mean, there are some people not just trying to generate plugins, but utilize ChatGPT, or, you know, generative image AI, in plugins. So our friend of the podcast, Ryan Welcher, he does streams every Thursday, and not even a couple weeks ago he was kind of finishing up an example plugin.

Cory (00:11:33):
Some of his plugins he puts on the repo; I don't know if this one is going on there or not, but he was working on an image-generating plugin for WordPress. So we're seeing a lot of it. Some people might be familiar with the alt-text-generating plugin that's been making the rounds. All sorts of things. Yeah, the image generation, I don't know. Okay, so we've got... I'm kind of lumping the code generation in with the text generation stuff. I mean, they're trained on kind of different sets of things, but essentially they're outputting language. And then, like, the image generators are kind of a different ballgame. I wanna try at least to summarize how they work, and probably get it terribly, horribly wrong, but bear with me <laugh>. So <laugh>, for these diffusion models, you know, they're ingesting thousands, millions of images that are tagged in some way. So there's a human labor element associated with this, right? And that gets back to the ethics, of course. But so these images are diffused, right? Like they're literally—

Cory (00:12:59):
Blurred, yeah. Right. Like, kind of, noise added. And, you know, then they're un-blurred, or at least that's kind of how the generating part works: it's starting from random noise, or some sort of input, and trying to un-blur an image. And I think that this was kind of one of those happenstance inventions. Like, I remember 10 years ago, when up-resing images was very new, people were taking really old pixel art and, like, up-resing it to be super bubbly and whatever. And I think this is kind of a strange offshoot of that, that became more of a, you know, cultural phenomenon than just up-resing stuff. I mean, we've seen up-resing stuff a lot, even, I'm sure, in games and stuff like that, where they're releasing remastered versions and probably using tools like Photoshop to up-res things, or at least... I actually don't know. I'm assuming that Photoshop has, like, an intelligent up-res.
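
To make the "add noise, then un-blur" description above a bit more concrete, here is a toy sketch in Python. It is not a real diffusion model: the `toy_denoise_step` function is a stand-in for the trained network (a real model learns to predict and subtract the noise), and the 8x8 "image" is made up.

```python
# Toy sketch of the diffusion idea described above (not a real model).
# Forward process: blend an image with random noise, step by step.
# Reverse process: start from pure noise and repeatedly remove a little
# noise. Here the "learned" reverse step is faked by nudging toward a
# known target image; a real model predicts the noise with a neural net.
import numpy as np

rng = np.random.default_rng(0)

def add_noise(image, amount):
    """Forward (diffusion) step: mix the image with Gaussian noise."""
    return (1 - amount) * image + amount * rng.normal(0.0, 1.0, image.shape)

def toy_denoise_step(noisy, target, step=0.1):
    """Stand-in for the trained reverse step: move toward the target."""
    return noisy + step * (target - noisy)

target = np.ones((8, 8))          # the "clean" image we pretend to recover
noisy = add_noise(target, 0.5)    # forward: the image, half "diffused"

x = rng.normal(0.0, 1.0, (8, 8))  # generation starts from pure noise
for _ in range(50):               # reverse: iteratively "un-blur"
    x = toy_denoise_step(x, target)

print(float(np.abs(x - target).mean()) < 0.05)  # → True: close to the image
```

The point of the sketch is only the shape of the process: generation runs the un-blurring loop from scratch, starting at pure noise, rather than editing an existing picture.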

Phil (00:14:08):
I would assume as well. I know Nvidia and whatnot have added a lot of that to their offerings. You can, like, turn on settings that'll up-res your games, essentially.

Cory (00:14:18):
In real time?

Phil (00:14:19):
Yeah, real time. So that's already a thing. I can actually get some links for you. I'm not gonna even attempt to describe how any of this AI stuff works, unfortunately, just 'cause I really don't know. But I do know those tools do exist in real time for video games. Like, you can run a game at 720 (resolution) and up-res to 1080, and it actually looks even better than running it natively at 1080 in some cases.

Cory (00:14:40):
Before we move on from the image stuff: Jack, I don't know if you'd be willing to give us kind of an overview of what machine learning tools already exist in the Adobe Suite, and, you know, maybe what we can expect, potentially. Not necessarily officially, but just your opinion, or what you're expecting.

Jack (00:15:00):
So, yeah, in regards to Adobe stuff, there's kind of multiple tracks going on right now. There's a lot of stuff going on in the machine learning and AI world. So we've had Adobe Sensei for a long time. People have been using, whether or not they know it, AI and machine learning in Adobe apps for a long time. You know, back when, I believe it might have been, content-aware was the first thing that came out in Photoshop.

Cory (00:15:30):
That was like, when we were in college, which I'll let other people date me. <Laugh>

Jack (00:15:35):
That was using machine learning. That was kind of like the birth of Adobe Sensei, in that timeframe. And we've only moved on from there, until now we have, you know, the smart captioning stuff that's going on in Premiere, and Illustrator's got new image trace, using Adobe Sensei to do, like, smarter image trace stuff. So there's that side of things, right? They're kind of beefing up the tools that exist inside software that is meant to be used by creatives, to, you know, amplify their work, elevate their work. And then we've got, on the other side of things, Firefly, which is kind of its own thing that's more like the traditional generative models, right? Using data sets to generate images. There's a whole bunch of kind of in-exploration features that they're looking into. Obviously, like, video stuff is a big one.

Cory (00:16:30):
And audio, of course.

Jack (00:16:31):
As well as, like, you know, sketch-to-vector, I know, is another big one that people are excited about. But at the same time, they're kind of developing all of these on two tracks: tools inside of software for creatives, and the image generation kind of stuff. Where those two collide is yet to be seen. They're also a part of this Content Authenticity Initiative, which is similar to, I guess, the Glaze kind of thing that's out there, although in kind of a different way. So a bunch of artists, as a way to, kind of, I don't know, not, like, get back at AI, but they've started using Glaze, which kind of protects, on some level, your artwork from being pulled by these,

Cory (00:17:23):
By being trained on

Jack (00:17:25):
Right. And so it kind of, like—

Cory (00:17:27):
I'll link to that. Yeah. It's really interesting. I almost wanna say it's almost like a booby trap, too, right? Because the model won't properly understand the artwork, and that training set could become useless, I think. I don't know. But it's also the sort of thing where the Glaze tool itself and the generators are going to be in a constant arms race, just the same way that we see, you know, spam detectors and spammers, right? Constantly updating the tools.

Jack (00:18:07):
Yeah. And I guess that's the thing: no matter what solution we come up with to try to, like, <laugh> get our arms around the legalities of, you know, content in this new age, it's gonna be a constant battle. You know, the only thing I equate it to is, like: before, you could use whatever password you wanted to on the internet, right? It didn't matter. We're building more and more complexities, and, like, I don't know, where does it end, kind of thing. But the Content Authenticity Initiative is actually, like, an open source project. And there's a beta out currently, so, I'll get the link for you after, but you can feed images into this sort of beta. And it actually exists as a beta in Photoshop, too, so you can save your images in Photoshop with the Content Authenticity credentials.

Jack (00:19:07):
And what that means is that if you upload an image into their, you know, Content Authenticity checker, it will actually provide you information on where that image came from, and what edits were done to it. It was originally meant to sort of combat deepfakes. But now, obviously, with the rise of AI, they're trying to use it to detect whether something has been created using image generation tools. So it can pick up on whether something was made in Firefly, for example. And it'll tell you that even if you alter the image: crop out the watermark, change it, you know, take a piece of it and put it in something else, it'll still pull the source. Even if you make an image composite, it'll actually show you all of the source images that created that composite.

Cory (00:19:54):
If I understand correctly. Even if you like, take a screenshot of the image?

Jack (00:19:58):
Yes. Even if you take a screenshot of the image.

Cory (00:20:01):
That blows my mind. I don't even understand how that works.

Phil (00:20:02):
That's interesting, 'cause I assumed it would, like, encode it inside of, like, you know, metadata or something. Yeah, the metadata, and then even the chain of, like, gobbledy-gook inside the code of the image itself. But it seems like it goes further than that.

Jack (00:20:15):
What Adobe's attempting to do is build a... so, Firefly, the text-to-image generator they have, it's got a couple of different things in it now. It's got, like, a text effects thing, and it's got, like, a vector recolor option as well. But Firefly is trained on Adobe Stock and public domain imagery. So the idea behind Firefly is, they're trying to build a tool that can be used commercially. In order to do that, they have to own the license to all the work in the data set. And so how the Content Authenticity stuff factors into it is that they wanna be able to prevent content that's obviously been generated using AI tools from being uploaded to Adobe Stock, right? They only wanna have content in there that they own the rights to. And so they're trying to build this kind of, like, AI... but, like, this is where it gets complicated, in that it's always gonna have to be a constant battle, right? Between, like, <laugh> the tool that gets built to protect artists versus, you know, people trying to get around it, right? Like, there's always gonna be people trying to get around it. I mean, it's been that way going back to people removing watermarks from images. Like, this is not a new sort of issue.
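
The hosts don't explain how matching could survive a screenshot, and the mechanism below is purely an assumption, not Adobe's documented approach: one plausible way is to fingerprint the pixels themselves rather than rely on embedded metadata, which a screenshot strips. A classic, much-simplified fingerprint is the 8x8 "average hash" sketched here in Python:

```python
# Hedged sketch: one *plausible* way image matching could survive a
# screenshot -- a perceptual fingerprint of the pixels (a classic 8x8
# "average hash") instead of embedded metadata, which a screenshot
# strips. This is an illustration, not Adobe's actual mechanism.
import numpy as np

def average_hash(image):
    """Block-average a grayscale image down to 8x8, threshold at the mean."""
    h, w = image.shape
    image = image[: h // 8 * 8, : w // 8 * 8]      # crop to a multiple of 8
    bh, bw = image.shape[0] // 8, image.shape[1] // 8
    small = image.reshape(8, bh, 8, bw).mean(axis=(1, 3))
    return (small > small.mean()).flatten()        # 64-bit fingerprint

rng = np.random.default_rng(1)
original = rng.random((64, 64))

# Simulate a screenshot: identical content, slightly different pixel values.
screenshot = np.clip(0.98 * original + rng.normal(0, 0.01, (64, 64)), 0, 1)

unrelated = rng.random((64, 64))   # a completely different image

near = int((average_hash(original) != average_hash(screenshot)).sum())
far = int((average_hash(original) != average_hash(unrelated)).sum())
print(near, far)  # small Hamming distance vs. a large one
```

Because the fingerprint depends only on coarse brightness structure, the re-encoded "screenshot" still lands a few bits away from the original, while an unrelated image differs in roughly half its bits. Again, treat the details as guesswork about how any real provenance service re-associates a stripped copy with its signed record.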

Cory (00:21:38):
Like tracing images. I mean, that's kind of a higher order.

Phil (00:21:43):
I mean, there's an entire like, area in another country of people just repainting famous paintings, so yeah.

Cory (00:21:51):
Yeah, I mean, if it's digital, then sure. No, that was dumb, never mind <laugh>, I'm sorry. I'm thinking of, like, you know... because both Jack and I used to work in the publishing industry, specifically textbooks, and yeah, I feel for people doing production work in that field right now, because it was already a sinking ship when we got out of that industry, you know, not too long ago. Everything being outsourced, and, you know... and now, like, because, correct me if I'm wrong, Jack, but a lot of the asks for art production for textbooks are just, like: here's the art or chart or whatever that we want, just remake it.

Jack (00:22:39):
Mm-hmm. Yeah. I mean, that's sort of the industry in a nutshell: 90% of what we were creating was, "here's this artwork, render it out of copyright"; "here's this artwork, render it in our style." There were a lot of those kinds of asks. And so the AI is definitely going to disrupt that industry, and production art as a whole, which is kind of, you know, bittersweet, in that a lot of the skills I developed in that job certainly benefited me, you know, moving up in my career, in terms of learning how to work faster and more efficiently <laugh>. But, you know, at the same time, like you said, it's kind of a race to the bottom. And I think that, and this may be a bit of a hot take, a controversial opinion...

Jack (00:23:38):
I like to use this phrase a lot: AI should work for creatives rather than against them. And I think that there's something positive, if you can find a positive spin to, you know, job loss, which would be that it frees artists up to do the work that they actually want to be doing. I mean, I was pretty miserable, you know, just kind of recreating work out of copyright, essentially. Like, first of all, that's already kind of shady, right? You know, <laugh>.

Cory (00:24:17):
I wanna come back to how it's affecting, you know, people just starting out; I wanna save that bit for later. I am curious about your take, bringing it back towards kind of what we do at the agency level, you know, now that Adobe is set to acquire Figma. I don't know if that's gone through yet.

Jack (00:24:43):
I know that's a whole thing,

Cory (00:24:45):
But, you know, UI stuff is imagery, right? Do you foresee a time when, you know, we're generating entire Figma compositions, <laugh>?

Jack (00:25:01):
This is where it gets complicated. And I think that eventually we will get to a point where AI can create interfaces on some level, but AI at this point, is really bad at context. And it's just really bad at like, understanding, you know, you can only really give it bits of information and have it create sort of static singular things. It doesn't really know how to build a system.

Phil (00:25:33):
Yeah. I'd like to build on that a little bit later, especially when we get into... you know, I'd really like to talk about how I practically use it in my day-to-day. But I think when we're talking about AI, a lot of people are afraid of it, and, you know, they're thinking about it as this wholesale solution to a large, complex problem. But in my practical applications of it, and I know we're still in the early days, it's been more piecemeal. You know, I'm using it to do bits and pieces of my work, to get certain tasks done, and then using my experience and talents to combine them all. And, you know, I can definitely see a future where we have a built-in Figma plugin or something, where it's like: Hey, generate me some color swatches, get me button styles, you know, maybe experiment with typography and stuff like that. But the idea of saying, "Hey, build me a website <laugh>, give me a whole layout," is a little bit of a hard ask for, I think, anything, you know, no matter whether it's human, organic intelligence or artificial intelligence. Yeah, that is hard to ask, 'cause there's just so much context that needs to go into those asks to be able to get the appropriate response out of it.

Cory (00:26:45):
Yeah, so, transitioning back into, you know, the code side of things: there was <laugh>, there was that thread that happened just recently, that I'll link to, of somebody, you know, very confidently telling someone else, like: "Oh, that thing that you spend hours on every week or whatever? Sure, ChatGPT can just make a program for you in 20 minutes," or something like that. Right?

Phil (00:27:12):
Yeah. I think it was kind of insane. It was something like: Hey, this task takes you 200 hours to do, we could automate it in the next 20 minutes. But more or less they found out that they couldn't do that. <Laugh>

Cory (00:27:23):
<Laugh>, I'd like to point out how funny it is that, I don't know how to describe this... Like, it's no longer just about, "oh, we can automate you out of having to do that," i.e. your job. Right? Automating people out of a job has been a thing for a long time, but now we can automate automating you out of a job. <Laugh> <laugh>.

Cory (00:27:49):
But, you know, the thread itself, again, I'll link to it, is just, you know, the cautionary tale of our current time, right? If we take a snapshot right now, you know, getting back into the language model stuff, specifically code generation: if you've got a very complex problem that hasn't been automated yet, it's probably for a reason, because it's complicated. And then you just try to ask ChatGPT to make you a program to do that. Basically, <laugh>, basically this thread, you know, in stages, is like: oh, actually, that didn't work, so we had to be more specific here. Oh, that didn't work, we have to be more specific here. Oh, that didn't work, I had to get into the code and adjust these things, et cetera, et cetera, et cetera.

Cory (00:28:42):
And it just got to the point where you start seeing the day-to-day life of a programmer, just, like, what they have to do to make a thing work, but with this, you know, machine stuck in the middle that is not intelligent. I mean, we keep calling it AI, right? But there's no thinking. And I know there's all sorts of arguments, and I'm not getting into it, about, like: well, it's doing the same thing that human brains are doing. But whatever; we have more context than just what is being fed into it. Right? Just at this moment in time, again, things will get more complex in the future, but for now, you feed it a prompt, and based on that context alone, and potentially, you know, earlier prompts, it's giving you output. Right?

Cory (00:29:32):
And, yeah, anyways, it's an interesting tweet thread. But, you know, the language models, right? And, again, the same thing that does code generation. These work a little differently; the whole, like... you can't blur text, right? It's a different thing. It kind of reminded me, when I was reading about this, of how, you know, code in your editor is tokenized to, like, you know, give you syntax highlighting, right? But on a very granular scale, where tokens might not be single words, but might be pieces, parts of words, even. And basically, as far as I understand it right now, the way that these text generating algorithms work is they are going, not word by word, but token by token, and just finding the most probable next token, just going in order, linearly.

Cory (00:30:43):
That's why, when you ask ChatGPT a question, you can watch it type out the response to you. Right? Because it's literally just plopping those tokens in, one after another, as it generates them. So, with that in mind, I struggle to articulate my qualms with this whole thing happening right now, and the zeitgeist that we're in with, you know, generative technologies. And the best way that I feel like I can put it succinctly is that it's a lot of excitement about a thing that kind of pulls everything into the average. You know what I'm saying? Like, these things are just finding the next most probable thing, which is gonna be basically an averaging of everything that you've put into it, more or less.
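
The token-by-token process described above can be sketched with a toy "model". The bigram counts here are a stand-in for a real language model, which scores tens of thousands of sub-word tokens with a neural network; the generation loop, though, really is this linear, append-the-next-token process:

```python
# Toy sketch of next-token generation: a bigram count table stands in
# for a real language model. Generation appends the most probable next
# token, one token at a time, in order -- the same linear loop that makes
# a chatbot's answer appear word by word.
from collections import defaultdict

corpus = "the cat sat on the mat and the cat sat on the rug".split()

# "Training": count which token follows which.
counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_token(prev):
    """Return the most probable next token after `prev`, or None."""
    followers = counts[prev]
    return max(followers, key=followers.get) if followers else None

tokens = ["the"]          # the prompt
for _ in range(4):        # generate four more tokens, linearly
    t = next_token(tokens[-1])
    if t is None:
        break
    tokens.append(t)

print(" ".join(tokens))  # → the cat sat on the
```

Real models also sample from the probabilities instead of always taking the max, which is why outputs vary run to run; but the pull toward the most probable continuation is exactly the "averaging" being described here.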

Phil (00:31:37):
I think that's, yeah... And I think the best example of that is if you go to something like ChatGPT and ask it to write you an essay, like a very basic essay about something simple. We keep seeing the example of, like, "tell me about the moon landing," which, no offense to everybody doing that, but I can find something better to do.

Phil (00:31:56):
<Laugh>. Yeah. It's almost as bad as the Copilot example of: "How do I center a div?" Like, can we drop the meme? But if you ask it to write you a very simple essay, it just kind of has a bunch of filler words. It's like: oh, this happened at this time, and then a whole bunch of filler words, and then this happened, and then a whole bunch of filler words. And then: let's repeat the whole essay again really quickly in one sentence, just to, you know, seventh grade, sixth grade English class style, and then call it a day. And, like, yeah, when you read it, you're like: okay, yeah, you just averaged out, like, five different articles into one little paragraph. Which is, you know, sometimes useful.

Cory (00:32:36):
Yeah. That's the thing for me: it's not coming up with some sort of outline and kind of putting these concepts in order. It's going token by token. But that's where the prompts come in, I guess. And you have people calling themselves prompt artists now <laugh>.

Phil (00:32:58):
Yeah. But I mean, we had people calling themselves ninjas and rock stars and yeah. Like whatever, like the people are gonna, like, people are gonna be people, they're gonna try to LinkedIn-anize the whole system and try to figure out a way to make a job out of it and, you know, whatever. Let them do their little thing.

Cory (00:33:17):
I have to admit that I also tried, I had to, of course, it was everywhere: I tried generating a plugin for WordPress. It was, you know, just an idea that came to me, actually because of this very podcast. So just to set the stage: you know, we've been trying to figure things out regarding social media, and in particular Mastodon, right? In The Loop does not have a Mastodon account. I have one; I post things about the podcast and about all my other interests. Right. But the podcast itself, the brand, if you will, doesn't have an account on Mastodon. And I actually put out a poll there, because, just, the feeling of the network... it doesn't feel like the place for brands. I don't know, maybe I'm wrong. So, you know, I wanted to know if people wanted to see that sort of thing, and it was kind of inconclusive.

Cory (00:34:18):
But, you know, one thing that occurred to me during that process was that the website itself, that we are posting transcripts on... which, by the way, are not completely automated, but there is AI involved in that process, right, as a first pass. So that's one point to AI there. So yeah, anyway, the website is posting transcripts, you know, about each episode. And there is a plugin called ActivityPub that you can install, and do a lot of configuration, to basically turn your website into a little ActivityPub node that can be viewed and subscribed to, and even commented on, in Mastodon and other ActivityPub, you know, federated stuff. Right? So the one thing that was stopping me is that the way that it works is author-based. So you can subscribe to authors like they were users on an instance.

Cory (00:35:19):
And I realized that if we wanted to publish these articles on Mastodon as something that you could subscribe to, it would be nice if it was, like, [email protected]. Sorry, that took a while to set the stage, but that's where we're at. Right? So, you know, I figured: why not write a plugin to, you know, have a specific custom post type always have the same author? And that way, all the posts about the podcast, for the podcast episode transcript custom post type, would show up as that author. And I thought that this was a simple enough thing to try AI out on. Obviously it didn't work very well, <laugh>, is the short of it. Right? You know, it got strange things wrong, honestly, that I wasn't expecting. And I don't remember exactly what those were, but it was just, like, you know, hook names or something like that, that I wasn't expecting it to get wrong.

Cory (00:36:19):
But once I fixed those things, I realized, just as if I were developing this myself: oh, I can't just change the author on save. There's potentially a backlog of posts with a different author, and what do you do with those? Those are places where developers have to make decisions about what the experience of using this plugin is like. Are there settings? Is there some batch process in the backend to swap the author on all the existing posts? Prompting a generative text machine is not going to have that context. So that ends up being my experience with code and these prompting interfaces.
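(For readers: a minimal sketch of the core of the plugin Cory describes. The post type slug `podcast_transcript` and the author ID `42` are invented for the example, and the filter callback is kept pure so it works outside WordPress; it deliberately does not handle the backlog of already-published posts Cory mentions, which would need a separate batch update.)

```php
<?php
// Force every post of the transcript custom post type to a fixed author.
// Assumed names: 'podcast_transcript' (post type slug) and 42 (author user ID).
const TRANSCRIPT_POST_TYPE = 'podcast_transcript';
const TRANSCRIPT_AUTHOR_ID = 42;

function force_transcript_author(array $data): array
{
    // Only rewrite the author for our custom post type; leave others alone.
    if (($data['post_type'] ?? '') === TRANSCRIPT_POST_TYPE) {
        $data['post_author'] = TRANSCRIPT_AUTHOR_ID;
    }
    return $data;
}

// Hook it up when running inside WordPress; guarded so the file loads standalone.
if (function_exists('add_filter')) {
    add_filter('wp_insert_post_data', 'force_transcript_author');
}
```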

Cory (00:37:12):
My experience has always been having to write more and more specific, basically pseudocode, which is essentially what a prompt is. We try to keep it as natural-language as possible, but you start to narrow in on things: okay, this part of your code worked, but this function didn't; rewrite that function specifically. And I got fed up with it; it wasn't gonna work for me that way. I don't know, maybe some people got it working. Really, I expect that the people who got a plugin working were just asking it to do a thing that had already been done. You know what I mean?

Phil (00:37:56):
Sure, something simple, maybe. If I could take the reins for a moment, because I feel like I've had a different experience with AI, especially this year specifically. I knew it was gonna happen at some point, so I'm gonna say it now: I, for one, welcome our AI overlords. I feel like that was an obligatory quote I had to make at some point. But I've had a very different experience in general. I use ChatGPT and AI in general on a daily basis now in my workflow. It took a while to integrate it. I was not a late adopter, but definitely not the first adopter. I think you were the first adopter of Copilot at the company. You were using Copilot, and then you were talking about it in some dev chat.

Phil (00:38:41):
And I was like, well, maybe I'll install this and give it a go. And my experience with that was very interesting. I didn't really read the docs or anything; it had predictive text, which was kind of nice. Then you keep moving along with that, and you find out what you can actually do: oh man, I just need to massage this data from one place to another place, and I just need a function that does that. And I don't wanna go Google the PHP function that does that. So you write just a comment in your code, "a function that does X, Y, Z," and it kind of just spits that out for you, and you're like, wow, thank you.
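(A hypothetical illustration of the comment-driven workflow Phil describes: the comment is the prompt, and the assistant fills in the body. The function name and data shape here are invented for the example, not taken from Phil's actual project.)

```php
<?php
// A function that re-keys an array of rows by one of their columns,
// e.g. turning a flat list of posts into a map of ID => post.
function index_rows_by_key(array $rows, string $key): array
{
    $indexed = [];
    foreach ($rows as $row) {
        // Use the named column as the new array key.
        $indexed[$row[$key]] = $row;
    }
    return $indexed;
}
```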

Phil (00:39:19):
That was just a little bit of brain work that I didn't want to do right now, and now I'm kind of seamlessly working. So that was my foray into starting to appreciate what AI could do for me on a line-per-line basis. But it kept evolving from there. I was already a user of Midjourney for my D&D campaign, stuff like that, so I wasn't against any of this stuff to begin with. But when ChatGPT started coming out and getting better, I started going in there and doing the novelty things. I had it write obituaries for myself and my fiancée. That was fun. And then it kept going from there.

Phil (00:40:01):
I was like, okay, what else can I do? And it kept getting better and better. On the practical side of things, I recently had it rewrite three of my very old blog posts, posts that had little code snippets. I kind of let ChatGPT take the wheel a little bit. I basically copied and pasted in the code, and maybe a bit of my writing style, and said: read this code and rewrite this blog post to be more professional <laugh>, write this thing for me. So about three of my blog posts have now been rewritten with ChatGPT in mind. And these are tasks that are difficult for me. The code side of it was fine; I asked it to optimize it, and some of this code is old and definitely needed a little optimization. It was nice of it to read through and find a few little things: hey, you're looping through this variable in a weird way here, and so on.

Phil (00:40:51):
And we can rewrite some of this. Maybe I even asked it to use PHP coding standards for WordPress, and it did that to some degree as well. So, essentially, three of my blog posts got rewritten in an afternoon of me just feeding it in there and massaging the data a little bit to get out paragraphs of text that would never have come out of my hands naturally, because I'm just not that type of person. I don't even believe I speak very well, let alone type very well. So I was very happy to have blog posts that other people would benefit from, with my code that people would be able to use. That was an interesting experience in recent history.

Cory (00:41:32):
Correct me if I'm wrong, but that kind of sounds like automating a dreaded task for programmers, which is not just writing, of course, but automated refactoring. Even if it's not specifically about your code, you refactored your blog post.

Phil (00:41:54):
In a lot of ways it is, yeah.

Cory (00:41:56):
With more approachable language, maybe. Or maybe you didn't describe much at all, and you had it describe in natural language what the code is doing, that sort of thing.

Phil (00:42:06):
Yeah. I mean, it commented things that I had left out in the first pass, because naturally I was just typing to solve my own problem, and I wasn't thinking about somebody who'd go into that code and have to read it themselves.

Cory (00:42:19):
I just wanna stick a pin in this idea of using a large language model to do language tasks. <Laugh> Continue. <laugh>

Phil (00:42:27):
Sure. Yeah. And to continue that train about language tasks: I was recently asked internally to provide a description of an API integration, and I wrote a little blurb for myself. It wasn't very professional, but it was what I could get out at the end of the day when they needed it. I then ran it through ChatGPT and basically asked it to make it more professional. It did that for me, and I was like, thank you. Again, these are words in a way I wouldn't write them. But going back to a more granular scope when I'm writing code: I had a large feature ask, and I wasn't asking it to wholesale create me a plugin or a feature or anything, but along the way, while I was doing my own coding, I would ask it to do certain tasks for me.

Phil (00:43:14):
I was building a calendar, so I needed a lot of date and time stuff. I was asking it things like: hey, I need an array that starts today and ends at this end date, and this end date is based on a database entry from WordPress. And it outputted that for me, and I massaged it myself. Then I wanted that array to be filled with certain pieces of information pulled from a post type, and so on and so forth. I kept asking it to do these heavy-thinking pieces that would usually have required me to go skim through a couple of tutorials or examples or Stack Overflow and pick out the pieces I needed to build this visual calendar.

Phil (00:44:00):
Instead of having to do all that, I asked ChatGPT to do that heavy lifting for me, and I was able to accomplish that feature on a much faster timeline than if I had done all that work myself. Again, that would mostly have been me sifting through material to get to those answers. Maybe I'd have handwritten a handful of it, but creating a calendar from scratch is a nightmare and a half. You've gotta juggle a handful of ideas in your head at once, and it's just not always the easiest thing to do.
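(A hedged sketch of the kind of date-range helper Phil describes asking for: a keyed array of days from a start date through an end date, ready to be filled with per-day data. The function name and placeholder payload are invented for the example; in the real feature, the end date came from a WordPress database entry.)

```php
<?php
// Build an array keyed by date ('Y-m-d') from $start through $end inclusive,
// with a default payload for each day that can later be filled with post data.
function build_calendar_days(DateTimeInterface $start, DateTimeInterface $end): array
{
    $days = [];
    $period = new DatePeriod(
        DateTimeImmutable::createFromInterface($start),
        new DateInterval('P1D'),
        // DatePeriod's end is exclusive, so push it one day past $end.
        DateTimeImmutable::createFromInterface($end)->modify('+1 day')
    );
    foreach ($period as $day) {
        $days[$day->format('Y-m-d')] = ['events' => []];
    }
    return $days;
}
```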

Cory (00:44:38):
Sure. So you're directing it, more or less. You, the human, have the full context about what needs to happen, and you're asking it to do more and more specific things that automate the grunt work. There's still grunt work involved in programming. I don't know what people think we do, <laugh>, if they're not developers, honestly, but it's a lot of typing, right? <Laugh> And even before Copilot or those sorts of things existed, we've had tools in our code editors for a long time to autocomplete things for us, because boilerplate code is not fun to write. And boilerplate's a great use of generative language stuff: scaffold this out for me, and then I go in and fix the details. Which is interesting, so, bringing it back to image stuff: all the stuff I see is generating a raster image. Well, I'll let Jack explain what a raster image is, <laugh>, since she's the designer.

Jack (00:46:10):
I don't wanna get too deep into the weeds, but yeah, a raster image is an image created out of pixels, and you only have a specific number of pixels to work with, depending on your document settings. If we look at the image generators that exist out there right now, you're getting a flat raster file, usually a PNG, maybe sometimes a JPEG. And if you've ever tried to scale one, you can't scale up unless you use an AI tool, right?

Cory (00:46:50):
You can with an AI tool, <laugh>

Jack (00:46:52):
<Laugh> But generally, just to keep it high level: if you try to scale up a raster image, you're gonna see those pixels, because once you go over a hundred percent, they start to show. So that's what we mean by raster art versus vector art. Vector is SVG land.

Cory (00:47:12):
It's code underneath. It's math.

Jack (00:47:13):
Yeah, right? We are making paths, and you can assign certain appearance attributes to those paths, but they're infinitely scalable, because they're just paths based on point positioning. You can scale them up, et cetera, et cetera. That's different from raster art, which is made of pixels. Raster art doesn't have any of those nice points and paths that we can transform and translate as we see fit; we're limited to how it was made originally, with the original pixels.
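(A small illustration of the distinction Jack draws: the same shape as an SVG path, where only coordinates and a viewport define it, so the renderer redraws the math at any size instead of stretching stored pixels. The coordinates are arbitrary.)

```xml
<!-- A triangle defined as a path: three points, a stroke, no fill.
     Changing width/height rescales it with no loss of sharpness. -->
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 100 100" width="400" height="400">
  <path d="M 10 90 L 50 10 L 90 90 Z" fill="none" stroke="black" stroke-width="2"/>
</svg>
```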

Cory (00:47:46):
What we've been talking about a lot here is: yes, sure, we make use of some of these tools, but we don't expect them to hand us the solution. We expect them to get us a certain percentage of the way there, at least over 50%, generally speaking, and then we go in and edit the contents. We can do that with text, we can do that with code, we can do that with the transcripts for these episodes: we're reading through the entire thing as the audio is playing and fixing each and every word it didn't pick up correctly. How do you do that with a raster image?

Jack (00:48:33):
You don't <laugh>

Cory (00:48:34):
Unless you're, you know, a master painter who can do that yourself.

Jack (00:48:38):
Right. You'd have to edit it in a raster image editing program. And that's the interesting limitation that exists currently. Obviously, I can only speculate that vector is the goal of all of these image generation tool companies; they all wanna get to vector. But to take a step back for a minute: for what I do at Blackbird, raster output isn't particularly useful.

Cory (00:49:17):
We're not designing websites with Photoshop anymore?

Jack (00:49:20):
Right? It's not exactly useful for what I do. So right now it's an interesting toy, but it's been hard to figure out how this kind of stuff fits into my workflow. I've tried using it to generate ideas: mood-boarding stuff, ideas for branding, ideas just for colors and mood. But I feel like it has a hard time understanding me, again getting back to context: it can't understand me as a creative. It has yet to create something where I've gone, yes, that is what I had in my mind for this project. It's not very good at that kind of stuff. So obviously my hope for AI is to see it more integrated into tools that can take what I have in my head and help me massage it and get it out. Because there are multiple audiences for AI tools, right? There are people like me who want to utilize it to elevate their creative work, and then there's the other side, people who want to generate images wholesale. Those are two competing audiences with different needs.

Cory (00:50:39):
I just need somebody to explain to me, and I'm hoping maybe you can: what are these people expecting to use these images for?

Jack (00:50:49):
Yeah. I mean, eventually the point of it would be commercial use. That's the difficult thing, and it's what we're seeing a lot of people get frustrated with. Part of my responsibility is running some events in the Firefly Discord for Adobe, these prompt jockey events, and I see people get very frustrated when they give me a very, very specific prompt, something very specific they want the image generator to produce, and it can't do that. It can give you a spectrum of things: you want a surreal landscape, it'll give you a variety of those. But the joke is about counting the number of fingers on a hand, right? I had somebody really want me to generate three kittens for them, and it could not count out three kittens. It couldn't; it didn't understand. It doesn't...

Phil (00:51:49):
...know the number three, or even that it's drawn a kitten. It's just like, well, this blurry image eventually looks like what you think is a kitten.

Jack (00:51:57):
Right? And so it'll just pick spots to put kittens where it thinks kittens might go. People literally want to be able to create images that are in their head, and there's just no way.

Cory (00:52:10):
No way to do that, unless you spend 10, 15, 20 years <laugh> doing it yourself with a pen and paper.

Jack (00:52:17):
Right. And they get very frustrated when the tool isn't able to do that. So, I don't know. I think that when people do get access to these tools and start using them, they're gonna be sorely disappointed when they try to make a cool logotype and it can't.

Cory (00:52:38):
The only thing I can think of, personally, that you could maybe utilize these things for, and probably what we'll see first if it's not already out there, is a replacement for stock images. Because none of us sitting here enjoy stock images; we make fun of them a lot when we see them in the wild. This has nothing to do with your company; it's just got a general vibe, and you're like, okay, that's fine for this marketing copy or whatever. So I guess I can sort of see it there, but...

Jack (00:53:23):
I mean, to get back to what we were talking about a little earlier, and what I mentioned: there is a benefit, as much of a hot take as it is to say, in giving people the access or ability to generate the images they need if they can't afford an artist to do that for them.

Cory (00:53:41):
Well that's why stock art exists in the first place, I suppose.

Jack (00:53:43):
Right. And so there's a benefit there. It removes a level of gatekeeping, right? And I know artists might get kind of angry at me for saying that.

Cory (00:53:58):
Is it gatekeeping, though, to dedicate your life to learning how to do a thing with your own hands, and then people come along like: why can't I do that in 20 seconds?

Phil (00:54:12):
Well, I remember when DSLRs hit the scene and could do video. Oh my god, we can get depth of field out of a thousand-dollar camera, and the cinema world lost its goddamned mind: I went to art school and film school, and I have to use a $60,000 camera and a $20,000 lens to achieve this look, and you're just getting it for a thousand dollars. It's not the same. It can't be the same. And it's like, well, it is. You'd better get used to it. And it's only gotten better and better and better, and cheaper and cheaper and cheaper.

Cory (00:54:43):
Until we have it in our pockets.

Phil (00:54:46):
Yeah. Oh, yeah. Well, going even further on the "in your pockets" thing: imagine you're a small business. You sell X widget, and you just need product photos for your website, and you don't have great lighting equipment or anything. So you pull out your phone, take a photo, put it into Lightroom, and say: remove the background, give it a nice reflection, make it look shiny, remove the blemishes. That's a few clicks, instead of somebody who went to school to learn Photoshop painstakingly removing all of that.

Cory (00:55:16):
And that's a great example, but I will say that you already have to know to do those things to make it look good. People who have never done that sort of creative work before aren't gonna know what they don't know about why a professional image looks good versus theirs.

Phil (00:55:41):
I mean, yeah, they might not know the exact terminology. And unfortunately, I did go to art school, and I do see a world where Canva has a button that says "product photo." It literally just shows an example: the crappy photo they took with their phone, and what it looks like afterwards, which is just the product on a white background, cleaned up. Upload my photo, hit the button, done. They don't need to know that it removed the blemishes or anything; they just know it looks better than it did out of their phone.

Jack (00:56:12):
Yeah. Again, if it's a matter of a startup paying me to do branding work for them, they can afford to spend more time on the actual brand development because they have the ability to use these tools, you know what I mean? It gets back to my point about the give and take: the things I would rather be spending my time doing.

Cory (00:56:41):
Take that to the extreme. Well, not to the furthest extreme. The furthest extreme would be: the client wants a logo, they type it in, they somehow get exactly what they pictured in their mind, and they're done.

Jack (00:56:59):
Great. I mean, if they can do that, but it's probably gonna look like... <Laugh>.

Cory (00:57:04):
That's what I'm saying. So, to dial it back a little: they don't wanna spend a lot of money, but they need an identity for their business, which you'd think would be worth spending something on. And they generate something, and it kind of keeps getting close. That's a recurring theme I hear a lot with AI generation: well, it's kind of got the concept, but it's not really what I was thinking of, or I wanted this composition and it's doing it this other way. So now your work almost entirely transforms into "please fix this AI-generated logo," instead of developing it from scratch from a conversation you're having with them.

Jack (00:57:53):
Well, I think that's gonna be up to the individual designer to look at that work and say: this is a great starting point for you to get your ideas down; however, now we need to take that, treat it like a mood board, and develop your brand. I don't think we have to become beholden to AI-generated artwork as designers. That's where our expertise comes into play. We can say: listen, I know you created this thing, and I know you kind of think it's the be-all and end-all, but again, context. We need to think about the audience, we need to think about your brand traits. Does this align with them? What can we take from this? There's always a compromise. Clients have been coming to us with sketches forever, or coming to us with image ideas, all the time. It's not a new problem to solve. We have to step in and say: I am glad that your 13-year-old drew your logo idea for you on a napkin, but let's bring it back to reality <laugh>.

Jack (00:59:09):
It's the same kind of thing. AI is just replacing your 13-year-old cousin who draws really good.

Cory (00:59:16):
But I still have a problem with that. Even that, right?

Jack (00:59:22):
Well, the crux of it, the issue, where this all stems from, is a lack of respect for the craft we perform. That is the reason these data sets were even built. If there were enough respect for what artists and designers and creatives do, nobody would've even considered doing a straight data pull from the internet of everybody's work: it's fine, we can just take it, because who cares, it doesn't have any value. Right? And so that's the larger discussion. For artists, for creatives, and I mean creatives in the whole sense of the word, even programmers, coders, and I think you prefer "coders," there's a lack of respect for what we do, and that makes it seem okay to just take advantage of our work. And that's a bigger discussion, right?

Cory (01:00:23):
That's capitalism, baby.

Jack (01:00:24):
That's a symptom. It's a symptom of the AI problem, right? And I don't think that artists, creatives again as a whole, should be arguing or splitting hairs over which AI tools are bad. We need to be having a larger conversation about compensation and respect in the industry as a whole.

Cory (01:00:49):
Yeah. And then there's another conversation I wanna have too, getting back to something we tabled previously: how this affects people just now getting into the industry, whether artists or programmers, honestly. I have no frame of reference for what it's like right now to be learning one of these crafts, to be in school with things that purport to be able to do your homework for you. It's crazy, right? What I'm interested in, and what I don't think anybody's gonna be able to answer, but I wanna speculate about, is what this does to people just starting out. We personally know plenty of people, artists at least, if not other programmers, who are completely despondent that this is happening right now.

Cory (01:01:53):
That you can just go onto a website, type in some stuff, and it creates a potentially beautiful image, instead of them spending hours or days developing something. And for a lot of these people, the whole idea of imposter syndrome is just compounded by the proliferation of these tools. So why bother learning the craft of creating these illustrations or paintings or code from scratch, when apparently it can just be generated from the conglomeration of all human knowledge on the internet, rolled up into a ball and thrown at the wall? <laugh>

Jack (01:02:51):
Because it's not good.

Cory (01:02:53):
<Laugh> Well, we say that now, but it can already generate videos that are convincing, and they will be even more convincing, just like we saw image generation go from DALL-E to Midjourney or Stable Diffusion or whatever.

Jack (01:03:08):
I think that, as with any technology that's come before us and will come after us, people are always going to adapt to it. I know a lot of artists who have just completely checked out of the internet now: I'm just gonna sell my work at conventions, I'm not gonna post anything online, I don't wanna participate in social media or online culture anymore. There's been a lot of backlash in the industry as well, game studios banning the use of AI at any level of their work. But sticking our heads in the sand isn't gonna save us. It's here. Plenty of corporate entities, plenty of clients, are gonna utilize the tools. Again, it comes down to a respect issue more than a tool issue. AI isn't the problem.

Jack (01:04:01):
Machine learning, the software itself, isn't the problem.

Cory (01:04:04):
Capitalism is.

Jack (01:04:06):
<Laugh> Yeah, absolutely. And so when it comes to being a new artist or a new creative or a new anything in this new world, those people are gonna adapt just fine, and they're gonna move on. Whatever comes out the other side of this, whatever the new generation is producing, is gonna impact us more than it's gonna impact them. If you are driven enough, you're still gonna come out the other end and become a designer, a developer, whatever it may be. That's not gonna change.

Jack (01:04:50):
It just may not be something that we recognize.

Cory (01:04:52):
So I'm definitely the old man yelling at the cloud, essentially.

Jack (01:04:55):
Yeah, absolutely.

Phil (01:04:57):
Yeah, I've been told since my first day working, some 15-odd years ago, that my job will be replaced and automated at some point, and I'm still going; I'm still here, for better or for worse. I'll say this: my job today doesn't look like it did five years ago, and what I did five years ago didn't look like the five years before that, or the five years before that. And five years from now, my job won't look like it does today. We are in the technology industry; it's always gonna be changing. I thought the same thing about new people coming up when React became a thing. I was like, oh my gosh, I can't even imagine trying to get into this right now.

Phil (01:05:37):
It's so complicated, you know? But it's not for them. It's new for them, it's fresh for them. It was complicated for me because there was a different way to do it for so long, and it was hard for me to learn the new way. And I said this when we were talking about this podcast episode coming up: there are easier ways to make money. If you want to be doing something else, you can go do that. But if you want to be a developer, or you wanna be a designer, and you enjoy doing these things... sometimes it is just a job, it doesn't always have to be a passion. But if you are someone who is in this field, which butts up nicely against the creative field in a lot of ways, you're gonna become that thing. I don't think this is gonna be what ruins it for you. I don't think jobs in that field are gonna disappear. I just think the work and the job are gonna change, and they're always gonna change. That's the nature of the beast.

Jack (01:06:35):
Yeah. I do recall very often being told, in my previous life as a production artist in the print industry, that print was dead and we just needed to transition to all digital. And that certainly hasn't been the case. What's happened instead is that cheap print is dead; I would say specialty printing has had a huge revitalization.

Cory (01:07:04):
Is that what we're gonna be relegated to, specialty websites?

Phil (01:07:11):
We already are. People make Squarespace sites for their simpler websites, but they come to us because they need a bespoke thing, something Squarespace doesn't offer them. And those offerings will keep getting better, hopefully, and that's okay. That just means we get to do more bespoke things and more specialized work. We're gonna be building out designs that those things just can't do, and it's gonna keep going that way. Technology will keep advancing, but the asks from it will keep growing as well. They always do; it's never one-to-one. It's never like, oh, everything I wanted is exactly what technology is capable of doing right now. It's like, oh no, technology got better. I can now have 64 gigs of RAM for a hundred dollars on my computer, but now I need more out of it for some reason. Why? Because I'm a human, and we're always gonna want bigger and better.

Cory (01:08:05):
We always wanna be making something that stands apart from the baseline, right? And if generated anything is now the baseline, where do we go from there? I dunno.

Phil (01:08:20):
I was on TikTok this weekend, and some creator was lamenting that there's a Figma plugin where you hit a button and you can copy a website. They demonstrated it by going to Apple's site, and oh my god, now it's all just in Figma. Apple's design is in Figma, and they were like, I'm just gonna quit as a designer, and blah, blah, blah.

Cory (01:08:39):
That's not the point of that tool though, right?

Phil (01:08:41):
No. A: it's not the point of that tool, and B: I'm sorry, if you just wholesale gave that to your client, they'd go, why'd you just hand me Apple's website?

Cory (01:08:49):
<Laugh> The screenshot of

Phil (01:08:50):
Thank you for the screenshot of Apple's website. You're fired. We're gonna move on.

Phil (01:08:54):
Or even if it did get through all the checks and balances for some reason and went to the developer, and the developer built it, then all of a sudden your customer base is like, hey, why does this website look like Apple's website? That's not a good smell <laugh>. You're a boogie board company and it looks just like Apple's website. No. So of course I had to leave my comment, cuz some other people were like, my job's being destroyed by AI, I do PPC. I'm like, I doubt it. I don't believe you. Go ahead, please quit. I would love for you all to quit, get off Twitter, stop polluting my feed. Those jobs can just keep coming my way and our way, and to anybody who wants to keep moving forward. Cuz technology is always gonna keep moving forward, and there's nothing we can do about it.

Jack (01:09:45):
If you're in this industry to make money, then you, you made a mistake. <Laugh>

Phil (01:09:50):
Probably true, if you're on the production side of it.

Cory (01:09:56):
We're not doing it for the money. We're self-motivated to sit at our desks...

Jack (01:10:00):
No, we're definitely doing it for the money. But <laugh>

Phil (01:10:03):
Yeah, no, we're definitely,

Cory (01:10:04):
We do need to survive, but like, yeah, I don't know about you all, but like,

Phil (01:10:08):
I didn't keep doing it for the money, let's put it that way, <laugh>. Yeah.

Jack (01:10:11):
That's, that's a better,

Cory (01:10:12):
This is what my brain does. This is what, you know, gets me through the day. <Laugh>,

Phil (01:10:19):
I always have the hardest time talking to people who are getting into boot camps or whatever, because I've been doing technology as early as I can remember. I remember being in fourth grade playing with little JavaScript stuff and little HTML code, just kind of being a weird kid on the school computer. There are even photos of me before that as a kid taking apart VCRs and putting them back together. That's just who I've always been. So it's hard for me to relate to someone who finds later in life that they wanna do this work. There's nothing wrong with that, and you can totally do it, but you gotta want to do it. At the end of the day, there is no magic switch, no bootcamp or college that you can just go through, learn it, and then have all the skills you need to do this job. You're gonna have to care about what you're doing at some point.

Cory (01:11:10):
So since we're getting to the end here, let me try to summarize <laugh>. What I'm hearing from our conversation today is that yes, at least parts of our job are absolutely being automated. However, we can either embrace that or quit <laugh>. And embracing it doesn't necessarily mean that you have to use it. But if you embrace the idea that parts of what you're doing are being automated, regardless of the other ethical considerations, of course, it's maybe, hopefully, a motivator to do it even better and stay ahead of that baseline curve that the generated stuff is always gonna be inclined towards, right? It's always gonna be in the middle of that bell curve, because that's how it works.

Cory (01:12:13):
And considering that's always been what technology does and what humans do, maybe this is just the next phase. We certainly grew up with the new paradigm of the internet, and then after that, even in just the programming sphere, we saw Google completely change how programmers do programming. Then Stack Overflow was probably another huge paradigm shift; people could just look things up and make code that works. That's certainly how I learned. It might not be a copy-paste job anymore, but whatever that next phase is, with stuff that's generated from all of that knowledge, the next generation's just gonna have to deal with it. I don't know, it should be interesting to watch.

Jack (01:13:09):
Yeah. I wanna make it clear that I'm not an advocate for tools that take advantage of intellectual property.

Cory (01:13:17):
I don't think any of us are. Let's make that clear <laugh>.

Jack (01:13:19):
No, I mean, I am cautiously optimistic about what Adobe's trying to do with content authenticity and with Firefly using a fully licensed data set, but obviously I'm still cautious about all of that. I'm very much against people being taken advantage of for the sake of profits <laugh>, against building tools that don't compensate creators. And to kind of go off of what you were saying about certifications and stuff like that, ultimately none of that really matters, because it's your creative problem solving that is your value and driving force. If you don't have that and you're just motivated by money or certifications or whatever, then yeah, I can see that you'd be heavily dissuaded from pursuing this industry. But there will always be people who are looking to be creative problem solvers. I don't think that's gonna change. It may change what that looks like in the future.

Cory (01:14:40):
That's a really good point. Even after Google came into existence, there are still people who refuse to figure out the answers to their own questions when they're just a click away <laugh>. There always seems to be room for people like us who are, yes, that's a great way to put it, creative problem solvers who just know how to make stuff happen.

Phil (01:15:10):
I think we're all gonna be just fine. A year from now, we'll all be laughing just like we do about NFTs and blockchain. We'll just be like, oh, it's just another thing that kind of exists.

Cory (01:15:20):
Either that or all out of a job <laugh>

Jack (01:15:22):
People are gonna be sorely disappointed if they think, I'm just gonna magic what I want into existence. Everything is still gonna require some level of effort.

Phil (01:15:33):
Some level of effort <laugh>. You're gonna need specialized people doing things. And if you're some CEO thinking you can lay off your entire engineering force and just start typing away...

Cory (01:15:44):
Well, we see how that turns out <laugh>. Yeah, we're watching that in real time.

Jack (01:15:48):
That's happening <laugh>.

Phil (01:15:52):
It is what it is.

Cory (01:15:54):
All right. Well, on that note, I think we gotta end this here. But I'll be interested to see if one day our podcast gets fed into a generative AI so that we can just sit back and generate episodes for people to listen to. Wouldn't that be great? Isn't that what you want?

Phil (01:16:15):
It's been here the whole time. <Laugh>.

Jack (01:16:17):
They have voice modulators. You could have somebody—

Phil (01:16:20):
Descript actually already does that, to be fair. Like, yeah. We'll, we'll talk about this off-air <laugh>.

Cory (01:16:27):
All right, well here's the awkward ending. And cut.

Cory (01:16:29):
That's all for this episode. Check the episode description for links to things we mentioned in the show. And don't forget to send your questions, thoughts, and fan mail to [email protected]. You can also find us on the web at and on Twitter, Instagram, and TikTok as InTheLoop_WP. If you're interested in having a WordPress website custom-built, or you want to join a team that does that, head over to our site at and drop us a line. Thanks for listening to In The Loop. See you next time.
