Lawson’s Quality Crisis…The More Things Change…
June 6, 2007
A number of years ago, I wrote a LawsonGuru Letter article titled "Lawson's Quality Crisis". I really had high hopes that Lawson was turning the corner on these problems as part of their various initiatives (including their adoption of better CMM practices, the Xansa partnership, etc.).
In some ways, your Lawson “ownership experience” really has gotten better. In particular, the installation of CTPs and MSPs is greatly improved. Yet, despite Lawson’s new motto, “Simpler Is Better”, installation is still way too complicated.
But it’s the quality of the software that’s my topic here. Dean Hager stands up at CUE and user groups and proclaims that “this is the highest quality release Lawson’s ever done”. But it’s still shoddy. Sure, the CTPs may be easier to install. But when they break more than they fix, what’s the benefit? I’ve even seen some horrendous defects recently, which have corrupted data, deleted transactions, and scrambled GL entries.
When I look at the code that's getting shipped (presumably, a lot of these problem CTPs are coming out of Lawson's new offshore development group in Manila…), I see obvious bugs. I see programmers who don't understand COBOL variable scope and structure. I see code that a junior COBOL developer would vomit over. Maybe COBOL is a dying skill, but it's the lifeblood of Lawson's products, and they need to better train their programmers and review their coding and QA practices. There's a reason the "old" COBOL programmers did things "the right way", using all those "passé" practices that were tossed out during the dot-com boom. It was called "quality assurance".
Lawson's quality crisis is not limited to COBOL. It's also evident in the number and frequency of LSF9 patches being released. But when I saw this one, I truly realized that it's a crisis:
Has all that fancy Landmark nonsense caused Lawson to lose their focus on their core code?