Why are corporate IT teams and process automation teams so often miles apart, when their efforts are so interconnected? Not only at life sciences companies, but everywhere? The World Batch Forum highlighted this problem last month, but a presentation by Mark Kovacs of the consulting firm EnteGreat at StratusWorld, the company's user group meeting, summarized it neatly. (Stratus, which focuses on maximizing server uptime for companies in a number of industries, also launched a new business at the event.)

Some IT practices that negatively impact manufacturing:
- Automated patch management pushed to the MES or the shop floor may stop production ("but it's a critical patch from Microsoft, it should work just fine").
- Strict standards that do not take the unique needs of manufacturing into account (e.g., real-time data, time sensitivity, etc.).
- Driving efficiencies via centralization (e.g., attempting to run process historians over the WAN: "What's a scan rate, and why is it measured in milliseconds?").
- Access to HMIs and computers that control equipment defined via shared group or user IDs and passwords.
- Use of home-grown or unsupported applications as a "quick fix" (e.g., "Come on, how hard can it be to write VB code?").
- Failing to take advantage of existing IT best practices (e.g., "I know I have a backup of my PLC code somewhere…").
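The scan-rate complaint is easy to quantify. A back-of-the-envelope sketch of why historians over a WAN make engineers nervous; the tag count, scan interval, and sample size below are illustrative assumptions, not figures from the presentation:

```python
# Back-of-the-envelope bandwidth estimate for a process historian
# scanned over a WAN. All figures are illustrative assumptions.

TAGS = 10_000            # instrument tags the historian collects (assumed)
SCAN_INTERVAL_MS = 100   # a typical millisecond-scale scan rate (assumed)
BYTES_PER_SAMPLE = 16    # timestamp + value + quality flag (assumed)

samples_per_second = TAGS * (1000 / SCAN_INTERVAL_MS)
bandwidth_mbps = samples_per_second * BYTES_PER_SAMPLE * 8 / 1_000_000

print(f"{samples_per_second:,.0f} samples/s")
print(f"~{bandwidth_mbps:.1f} Mbit/s sustained, before protocol overhead")
```

Even under these modest assumptions the link must carry 100,000 samples per second continuously, and a high-latency WAN also risks missed scans entirely, which is why historians typically sit close to the process and replicate upward rather than scan across the wide-area network.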