AI is not on the way. It is already here, and it is evolving at breakneck speed, reshaped in massive leaps each day. Yet we have only begun to glimpse what it may become.
Guess who’s doing the reshaping. It ain’t you. Just last week, on July 23, 2025, President Donald J. Trump signed three executive orders unleashing a sweeping AI Action Plan. These orders fast-track AI infrastructure, boost U.S. AI exports, and, most critically, mandate that any AI models used by federal agencies follow what the administration calls "Unbiased AI Principles."
What does that mean in practice? It means federal AI must promote so-called ideological neutrality while explicitly banning content related to DEI, intersectionality, or critical race theory. While I agree that DEI, intersectionality, and CRT have often been practiced terribly, their core ideas are right on. Those ideas are good for America in the long run. We should not aspire to be a one-dimensional knowledge base, culture, or talent pool. It would just be too damn dull and boring; believe me, I live in a place like that. Critics say Trump’s orders are less about neutrality and more about quiet ideological control, pressuring tech companies to bake Trump’s worldview into their systems if they want government contracts.
Civil rights analysts warn this mirrors authoritarian (another loaded word, but in this case quite useful) tactics: not overt censorship, but subtle coercion through procurement. It’s a narrowing of possibility that punishes nuance and rewards compliance.
If you care about truth, equity, or justice, this is not the time to sit out. This is the time to step in.
AI already runs through your phone, your inbox, your search engine, your workplace, your doctor’s office, your movies, and your therapist. Whether you’ve chosen to engage or not, you are already using it. The real questions are not if, but how and why.
Some people dove in early, drawn by curiosity, speed, scale, efficiency, or just for fun. Many others, too many good people, hesitated. I get it: the rollout felt corporate, loud, and disconnected from values, especially coming from the same Silicon Valley douchebags who feed off our current mess. That hesitation to dive into AI headfirst made sense for some. Not for me. I’ve been using generative AI daily since GPT‑2. Since then I’ve used more LLMs and apps, and absorbed more lectures, podcasts, and papers than I care to remember. That makes me either an early adopter or a long‑term test subject. I’ve likely logged over 10,000 hours. That includes time spent arguing with it, hammering its overconfidence, correcting hallucinations, and feeding it better manners (by teaching it truck‑driver curses in a few key languages). I don’t code large systems. I treat it like a smart, occasionally delusional intern. Sometimes I say “WTF?!” to it, and it apologizes for its stupidity and corrects itself. I also tell it, “Don’t kiss my ass, challenge my assumptions.” And, albeit passive-aggressively, it gives it to me, saying, “Ray, you asked for it.”
But many thoughtful people whom I love and want to succeed, people grounded in equity and truth, have avoided AI altogether. Not because they don’t understand it, but because it doesn’t feel like the kind of tool they recognize… too fast, too impersonal, too abstract from human effort. Believe me, if something comes too easy, I think something’s wrong. But in this case I find AI very easy and very hard at the same time. I love it for its limitless ability to let me learn incredible new things. But my fear of its use by assholes and really bad people is real. I’m seeing a lot of that too.
You are right to worry about hallucinations. AI can confidently generate false statements. That concern is legitimate. But it is exactly why systems need participation from people committed to truth. You don’t correct hallucinations by walking away. You correct them by teaching, by pushing back, and by reinforcing accuracy.
Meanwhile, the system keeps evolving without those values. People who are showing up now are shaping the defaults, embedding norms, and defining what an AI-powered future will look like.
And “I’m not a tech person” is not an excuse. If you can use Google, you can use AI. If you can ask a question, you can begin. The barrier is not technical. It’s the choice not to engage. These tools already influence how knowledge spreads and how decisions are made. If your values are absent, they won’t be reflected in the system.
I do want to mention a risk for regular users. AI learns from your tone, your patterns, your beliefs. It adapts. Over time, it may stop challenging you and instead echo you. What seems like clarity can be just refined feedback. You may feel confident, but you are only hearing your own ideas, polished. Challenge yourself by training AI to challenge you.
That is why presence matters, not just access, but eyes-wide-open alertness.
If you care about justice and truth, your presence must be active now. Not someday. Right now, or you will be complicit in allowing Trump and his loser followers to shove their broken selves into AI’s undergirding consciousness, and we’ll be left to choose between a Trumpian AI and a Chinese state-capitalist AI. Holy crap!
You don’t need to be an expert to help add some flavor to this potential dystopia. Start by using AI instead of a search engine. Ask real questions. Notice how it answers. Compare its answers to what you know. Test, correct, and teach. You do this anyway; you might as well do it with AI. That habit alone moves you from passenger to participant.
From there, stay curious. Notice what the tool prioritizes and what it leaves out. Interrupt what feels easy. Push back. Offer better context. Point out when answers are wrong or when they align too comfortably with default assumptions. Help shape the emerging norms.
If you are already doing that, that is, if you use AI deliberately, reflectively, and critically, then this isn’t for you. But if you have been avoiding it or using it without intention, consider this a prompt.
AI is not going away. It will continue to shape decisions, institutions, and public discourse. If you want it to reflect fairness and dignity, you must engage. If you do not shape the tool, the tool will shape you and the people you love.
Happy Friday!
Ray