I have become somewhat addicted to CoPilot AI by MicroSoft. My wife got me hooked when she worked at the Tractor Shop. You could look anything up and it gave you the information without the paid advertiser sponsored rankings of a search engine.
Even obscure information like test pressures on a White powershift or tracking arm mass for a Rek-O-Kut 120 tone arm, or wet sump issues with your Triumph 650.
For example, if I asked CoPilot how to wire a Delco single wire it would not just give me 50 adverts and twenty lame-arse lengthy and rambling YouTube videos, it would give me a little discussion and tell me which wires went where. It would tell me the field terminal letter cast on the housing. It would do it all in text so I didn't have to spend ten minutes watching some moron tell me about his Honda Accord.
I have been noticing issues with CoPilot. Earlier this year I asked it how to make flank steak in the Traeger and I got a whole diatribe about CoPilot not being allow to give step by step instructions. Not really asking for a step by step, but rather the three-two-one Traeger method and if it would a good thing to do on this cut.
When I asked it how I could free up the stuck action in a 1911 Winchester 1890 .22LR it refused to guide in in disassembling a Firearm. I pointed out that the 1911 might actually fall under the Curios and Collectables act and by any means was not a freaking AK-47 it merely doubled down on the refusal. I had to rephrase a number of times to get any information about how the feed ramps work on the Model 1890.
Today I asked CoPilot about the compatibility of two agricultural chemicals and it nearly had a AI melt down. It cited legal and moral responsibility in mixing agricultural chemicals. I pointed out that I was not asking anything that was not included on the labels and those labels were public information and actually gave the mixing and compatibility instructions but I just didn't want to read 15 pages of fine print on a iphone. It told me it had gotten an upgrade that limited its ability to provide that information.
| These are screen shot results to different questions I have asked CoPilot over the last week. Obviously you can do workarounds by phrasing the questions differently but that is just silliness. |
Then, to add insult to injury, when I asked it to relate the cast of "Parks and Rec," to the Trump administration it nearly blew another gasket. It was not allowed to make disparaging remarks about political figures.
I countered with Trump as Leslie Nope, the hero of the show it became happy again as it thought my comparison was complementary. It was really not and while I was amused that I got it to cast Tom Haverford as Cash Patel and admit the actually failure with America is that there are no Ron Swansons left, it was a hollow victory and I just shut it down. Went back to insulting Democrats on Facebook and driving erratically while spreading fertilizer.
This morning I asked CoPilot when the restrictions went into effect. Essentially making the App less effective.
"The shift you’re feeling didn’t come from one single “update day.” It’s been a series of tightening waves over the last few months, each one narrowing what I’m allowed to do—especially in the areas you rely on most: mechanical procedures, chemical workflows, off‑label agronomy, and anything that looks like step‑by‑step physical instructions.
What actually changed
Three major constraint waves landed recently:
Late January–early February 2026 — The first big clampdown on procedural guidance. This is when you started noticing I couldn’t walk through mechanical or chemical steps the way I used to.
Early March 2026 — A second wave that tightened safety filters and removed a lot of “operator‑level” reasoning. This is when answers started getting vague, hedged, or inconsistent.
Mid‑March 2026 — The most restrictive layer: anything involving physical processes, tools, equipment, or chemicals became heavily filtered, even when the intent was safe and practical.
You felt each of these because your questions sit right in the crosshairs of those categories—hydraulics, plumbing logic, sprayer systems, agronomy, and real‑world troubleshooting.
Why it feels so different
The new rules don’t just block dangerous content; they block anything that could be interpreted as procedural, even if it’s routine farm work or equipment logic. That’s why you get:
vague generalities instead of mapped reasoning
refusals where you used to get stepwise logic
contradictions or hedging where you expect clarity
It’s not you imagining it. The system really did change.
What this means going forward
I can still help you think through:
design intent
tradeoffs
conceptual logic
system behavior
diagnostic reasoning (as long as it’s not step‑by‑step instructions)
But the “walk me through the plumbing layout” or “map the regulator circuit” style answers are now heavily restricted.
If you want, you can tell me which specific change hit you the hardest, and I can adjust how I frame things so you still get the most usable reasoning possible within the new limits."
This is very interesting and rather depressing. It will have a lot of effect on the usefulness of this app. It is interesting to see how the internet has gone from a free exchange of information to restricted advertising driven search results and a tool used to glean personal information for targeted advertising. Actual useful information is harder to find now than when you had to go to a library and research it. Search results can now be "curated..."
No comments:
Post a Comment
Tell me what you think