As my 25th winter in Chicago approaches (and after the 2016 elections), I attempt to cheer myself up with jokes. Programmer jokes.
Jokes that largely have to do with being too lazy to verify assumptions or prove claims before making decisions that turn out to be really bad decisions.
Here's the most recent "joke" I heard from a developer, via an email back in July:
We just encountered a PL/SQL performance problem after migrating a database from 10g to 11g and I thought it might interest you. I isolated the problem using 10046 traces and DBMS_PROFILER. So I was able to reproduce the problem with a very simple PL/SQL testcase, but I cannot explain it.
He then pasted in 439 lines of code and output. So this is the "very simple" testcase? It might be, but still I wrote back:
"It may be simple, but it's long and I'd appreciate it if you would summarize for me what you believe you have discovered."
This way, at least, I didn't have to feel the least bit obligated to tangle with his problem until he replied.
But his response was even better and more entertaining than I'd hoped:
I just found out the cause of the strange behavior...Somebody had set the PLSQL_OPTIMIZE_LEVEL to 1 instead of 2 at the database level so that a massive recompilation of all packages would go faster.
When the optimization level was set back to 2, the performance problem he'd noticed earlier disappeared. Hurray!
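If you'd like to verify that your own instance hasn't quietly been "tuned" the same way, here is a minimal sketch of the check and the fix, assuming you have the privileges to query V$PARAMETER and run ALTER SYSTEM:

-- What is the instance-wide setting right now?
SELECT value FROM v$parameter WHERE name = 'plsql_optimize_level';

-- Put it back to the default for everyone...
ALTER SYSTEM SET plsql_optimize_level = 2;

-- ...or change it just for your own session while you test.
ALTER SESSION SET plsql_optimize_level = 2;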
So now let's go back and parse that paragraph:
"Somebody had set" the optimization level - I sure hope that somebody else knows exactly who that original somebody is. You'd like to think that there aren't too many people with the authority to change the default optimization level on a database instance.
"…so that a massive recompilation of all packages would go faster."
Ah, that's just too delicious! Again, an element of truth that goes a loooong way towards mucking up application performance.
It is true that the higher you set the PL/SQL optimization level, the longer it takes for your code to compile, because the compiler is doing more work analyzing and applying transformations to your code to improve runtime performance.
Hey, let's shrink that paragraph down to its essence:
Spend more time compiling your code so that code runs faster for your users.
Sounds like an excellent tradeoff. We wait a little longer for code to compile (and would we even notice the difference? Unlikely.) and in return our users are happier.
Not for that DBA, though. Maybe he had a hot lunch date, or maybe he'd signed up for one of Tom Kyte's amazingly popular webinars and it was about to begin. At which point he might learn how important it is to keep that optimization level at least at 2.
Whatever the case, all he knew was that he needed to get that code compiled fast.
I just hope it wasn't on a production instance.
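And if you're worried that something similar may have happened on one of yours, here's a minimal sketch of how to find and recompile the affected code, assuming a hypothetical package named my_pkg and the standard data dictionary views:

-- Which of my program units were compiled below level 2?
SELECT name, type, plsql_optimize_level
  FROM user_plsql_object_settings
 WHERE plsql_optimize_level < 2;

-- Recompile one of them at level 2, keeping its other compiler settings.
ALTER PACKAGE my_pkg COMPILE BODY PLSQL_OPTIMIZE_LEVEL = 2 REUSE SETTINGS;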
Know Any Good "Jokes"?
I bet you've run into all sorts of hilariously truthy stories of developers or DBAs doing the most absurd things for the only slightly less absurd reasons. Please share them with me, either via comment below or by sending an email to steven dot feuerstein at oracle.com.
In the meantime, don't forget to set the optimization level to 1. Or maybe 0. That way your code will compile faster.
Giving you less time for that extra cup of coffee. Which is probably a good thing.
That was a joke, OK? Optimization level at 2 or higher, please!
Originally published in ODTUG's Confessions of a Quick and Dirty Programmer series