What should the management team prepare for when moving Java & .NET applications to the web?
Recently, our largest client's Chief Information Officer (CIO) invited us to a supplier briefing and gave his view: they have so many legacy systems that are costly to maintain, generally incompatible, or otherwise not "future proof" that the only way to clean them up without internal political bias is to outsource the work to a subcontractor.
The CIO's view is, "We are not a software company. Period!"
The subcontractor will schedule every update, patch, and so on, and bill it to the application's cost center. Obviously this gives a lot of cost transparency, which the shareholders love, so they back the CIO.
The CIO's final pep talk (and policy statement, I assume) was that wherever a supplier moves its applications to the cloud, there will be no need for the outsourced subcontractor, as his company is happy to embrace this shift.
Discussions about security and risk were politely swept off the table, as in the CIO's view these are the application provider's responsibility. Why should his company pay for patches caused by bad coding practices?
Justified or not, I don't see the trend the CIO describes reversing. And it makes me wonder: given that web applications are not our core strength, what is the best practice for moving "old applications" into a world of web applications?
I'm neither a programmer nor on the executive team, but I need to maintain the integrity of all three parties: the programmers, our management, and the client's users.
With all the "unknown unknowns", I fear the product management team is in for a tough ride, so I might as well ask others who have been through this process before:
What should the product management team prepare for?
To the moderators: before posting I was a bit worried I might get flamed for asking this, but I found this one so inspiring that I thought it would be worth a shot.