Compared to other roles, DataOps is relatively new in the data industry.
DataOps involves knowing what normal looks like. Does your data source normally supply a thousand records an hour, but suddenly supplies only ten?
Everything may be running operationally and data may be flowing as it should, but a change in the volume, frequency, or even distribution of the data records themselves can affect the system.
Training data science models on unhealthy data can harm their results, so stopping that data from being applied to your machine learning algorithms can be critical.
Our DataOps Engineers will identify the key measures required to understand if a data source, data integration or data transformation process is operating normally. They will perform the analysis to understand what normal looks like and set up the required alerting rules to highlight any anomalies.
Any identified issues will be captured as a new data source, following the normal Data Engineering process, so that they can be referenced by the downstream processes that depend on that data.
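As a minimal sketch of the idea, the check below flags an hourly record count that deviates sharply from recent history. The function name, the example counts, and the z-score threshold are all illustrative assumptions, not part of any specific system described here:

```python
from statistics import mean, stdev

def is_anomalous(history, latest, z_threshold=3.0):
    """Flag a record count that deviates sharply from 'normal'.

    history: recent hourly record counts that represent normal behaviour.
    latest: the most recent hour's count.
    z_threshold: how many standard deviations count as an anomaly
                 (illustrative default).
    """
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        # History is perfectly flat; any change is an anomaly.
        return latest != mu
    return abs(latest - mu) / sigma > z_threshold

# A source that normally supplies around a thousand records an hour:
normal_hours = [1020, 980, 1005, 995, 1010, 990, 1000, 1015]
print(is_anomalous(normal_hours, 1003))  # in line with history: False
print(is_anomalous(normal_hours, 10))    # sudden drop to ten: True
```

In practice such rules would feed an alerting system rather than a print statement, and the baseline would be recomputed as the source's behaviour evolves.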
Our DataOps Engineers can work closely with:
DevOps to define metrics and alerts within an existing system
Data Engineering to ensure that the relevant and important metrics are being produced
Business Intelligence Engineers to surface data system health into your Data Warehouse and Reports
"We have lots of data and information on the transport network and the challenge is capturing all this information, the dependencies and getting insight from it. When we don't have to worry about technology, infrastructure, storage or performance, it means we can concentrate and focus on getting insight to improve the transport network and travel of people in Greater Manchester."
Malcolm Lowe, Head of IT at Transport for Greater Manchester
Contact us today to learn more.