The relationship between business policies, business rules, business processes, and business master data - Modern Analyst
2020-11-06 01:19:00 【On jdon】
Data quality is critical to the proper operation of an organization's systems. In most countries and regions there is a legal obligation to ensure that the quality of data in systems, especially financial systems, remains high.
For example, paragraph 51 of APRA's prudential practice guide CPG 235, "Managing Data Risk", states:

"Data validation is the assessment of data against business rules to determine whether it is fit for further processing. It is a key set of controls for ensuring that data meets quality requirements."
At some point in an organization's life, a lack of data quality, or of evidence of data quality, can escalate to the level of a threat, for example:
- Poor data quality harms customers, especially when it becomes public knowledge.
- Regulators issue enforcement actions.
- The organization engages in M&A activity.
- The organization migrates core systems.
- Poor data quality hinders the enterprise's ability to respond to compelling market forces in a timely manner.
So, what is data quality?

Data quality is a measure of how well data conforms to business rules:

- if the data comes from an external source, the rules determine its validity; or,
- if the data is derived internally, the rules confirm its correctness.

Therefore, without standards that can be applied directly as rules, you cannot measure data quality; and without "master rules", you cannot have "master data".
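To make this definition concrete, here is a minimal sketch (not from the article; the rules and field names are invented for illustration) of data quality measured as the share of records that satisfy a set of executable business rules:

```python
# Data quality as conformance to executable rules (illustrative rules only).
RULES = {
    "age_is_plausible": lambda r: 0 <= r["age"] <= 130,
    "email_has_at_sign": lambda r: "@" in r["email"],
    "balance_not_negative": lambda r: r["balance"] >= 0,
}

def data_quality(records):
    """Return the share of records passing every rule, plus per-rule failure counts."""
    failures = {name: 0 for name in RULES}
    passed = 0
    for r in records:
        ok = True
        for name, rule in RULES.items():
            if not rule(r):
                failures[name] += 1
                ok = False
        passed += ok
    return passed / len(records), failures

records = [
    {"age": 42, "email": "a@x.com", "balance": 100.0},
    {"age": -1, "email": "b@x.com", "balance": 50.0},
    {"age": 30, "email": "no-at-sign", "balance": -5.0},
]
score, failures = data_quality(records)
# score == 1/3: only the first record satisfies every rule
```

Without the `RULES` set, the score is simply not computable, which is the article's point: no rules, no measurable quality.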
However, no organization we know of has implemented the concept of "master rules" in any practical sense.
Business rules take precedence over data and business processes
Ten years ago, "The Beast of Requirements Complexity" was published in Modern Analyst; its continuing relevance is confirmed by more than 40,000 downloads to date. The article identified requirements specification as a critical weak link in the traditional system development life cycle. It introduced a new approach to requirements gathering and specification, based on analyzing business strategy to identify the core business entities and the rules that manage changes in their data state. These rules are then captured and described as "decision models", to be implemented as executables in front-line systems. (banq's note: DDD models are one example of this.)
This runs counter to the then (and still) popular data- and process-centric approaches to requirements, which typically position rules as subordinate to data or processes rather than as first-order requirements in their own right. The article argues that rules are more fundamental than data and processes, because data and process requirements can be derived from the rules, but not the reverse.
Business rules are the implementation of business policies, and business policies inherently define business data quality: if data comes in, the rules validate it; if data is produced, the rules derive it. Either way, the rules are the essence of data quality. To be clear, the rules implementing a policy must be executable; a rule that cannot be executed is a description of a rule, not an implementation of it, and if the implementation does not execute, it cannot be the "source of truth".
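As a hedged illustration of the two rule roles described above (validation rules for incoming data, derivation rules for internally produced data), with invented function and field names:

```python
# Illustrative only: one validation rule and one derivation rule as
# executable code, rather than as prose descriptions of rules.

def validate_incoming(policyholder):
    """Validation rule: external data is checked against policy, not trusted."""
    if policyholder["age"] < 18:
        raise ValueError("policyholder must be an adult")
    return policyholder

def derive_premium(base_rate, risk_factor):
    """Derivation rule: internal data is computed, so the rule IS its definition."""
    return round(base_rate * risk_factor, 2)

p = validate_incoming({"name": "A. Smith", "age": 40})
premium = derive_premium(500.0, 1.25)  # 625.0
```

An English sentence such as "minors may not hold policies" describes the first rule; only the executable predicate can actually serve as the source of truth about the data it guards.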
In the ten years since that article was published, our business has been deeply involved in auditing, remediating, and migrating legacy systems, covering pensions (millions of member accounts), insurance (millions of policies), and payroll (dozens of first-tier corporate and government payrolls).

These audit, remediation, and migration activities span most of the major technology platforms of the past 50 years, including many that predate relational databases. On these platforms we have recalculated and corrected individual accounts going back as far as 35 years.
Master data management
Master data is a concept as old as computing itself [2]. It can be described briefly as a list of the computer's data. Gartner's more formal definition follows:

"Master data management (MDM) is a technology-enabled discipline in which business and IT work together to ensure the uniformity, accuracy, stewardship, semantic consistency and accountability of the enterprise's official shared master data assets. Master data is the consistent and uniform set of identifiers and extended attributes that describes the core entities of the enterprise, including customers, prospects, citizens, suppliers, sites, hierarchies and charts of accounts."
In practice, the authoritative classification and management of data that master data management anticipates is usually limited to COBOL copybooks in earlier systems, or to the data definition language (DDL) of relational databases and similar systems. In either case, fully automated methods can extract both the data definitions and the data itself within hours. Yet across the dozens of systems our business has worked with over 50 years, we have never encountered a reliable and accurate "master data" description.
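As a small illustration of why data definitions are cheap to extract automatically while the rules remain invisible, here is a sketch using SQLite's catalog (any relational system offers an equivalent; the table is invented):

```python
# Any relational system exposes its schema through DDL or catalog queries,
# so the structural part of "master data" is one query away.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE member (id INTEGER PRIMARY KEY, surname TEXT, balance REAL)"
)
# The would-be "master data" description: names and types, nothing more.
schema = conn.execute(
    "SELECT name, type FROM pragma_table_info('member')"
).fetchall()
# [('id', 'INTEGER'), ('surname', 'TEXT'), ('balance', 'REAL')]
# Nothing here says what the data MEANS or which rules govern it.
```

The query recovers names, types, and positions, which is exactly the "nothing beyond the name, the data type and its position in the classification" situation the article describes.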
Whether "master data management" is attempted or even effective, there is an elephant in the room: even a complete and authoritative "consistent and uniform set of identifiers and extended attributes describing the core entities of the enterprise", as Gartner recommends, does not tell us how the data relates to the business policies that govern it. Beyond names, data types, and relative positions in a classification scheme, we have nothing.
For example, a project we have just completed generated an ontology from multiple different source systems, ending up with both a "surname" and a "surname" as attributes of a person. Without reading the underlying code there is no way to distinguish the two attributes, which sometimes hold different values for the same person (hint: their usage differs!).
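A hypothetical sketch of that duplicate-attribute problem: two source systems both expose a "surname", and without rule metadata nothing in the schema explains why they disagree (all names and values below are invented):

```python
# Two source systems, identically labelled attributes, silently different
# semantics (say, legal name vs. preferred name). The schema cannot tell
# them apart; only comparing values surfaces the conflict.
system_a = {"person_1": {"surname": "Smith"}}
system_b = {"person_1": {"surname": "Smith-Jones"}}

def find_conflicts(a, b, attr):
    """Report entities where the 'same' attribute disagrees across systems."""
    return {
        key: (a[key][attr], b[key][attr])
        for key in a.keys() & b.keys()
        if a[key][attr] != b[key][attr]
    }

conflicts = find_conflicts(system_a, system_b, "surname")
# {'person_1': ('Smith', 'Smith-Jones')}: indistinguishable without the rules
```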
Master rule management
Master data management is not feasible without an equally robust "master rule management", and master rule management is not a concept we have found in practice.
Whenever we audit, remediate, and/or migrate, we need to find the rules that let us understand the data as a whole, and especially its quality. In our view, data quality is measured by the degree to which data conforms to its rules (that is, the validation rules for input data and the calculation rules for derived data). Rule analysis takes substantial time and effort. When we look only at the data in isolation, there is usually a starting point: the DDL or other file-system definitions can be read to obtain one automatically. Not so for the rules; with rules it is always first principles. All the data can tell us is that there must be rules somewhere!
Can we extract the rules automatically, as we do with the data?

The answer is no, although we do have tools that help with the process. More importantly, fully automatic extraction of the rules would not solve the problem, even if it were possible!
Effective rule management was not in place when these systems began. As a result, the body of rules has grown organically over a long period, without formal structure (weeds rather than a garden), and is typically spread across multiple systems and manual processes. This diversity usually means that the same rule intent is implemented in different, even conflicting, ways. Rules are often applied across multiple systems in an ambiguous order. The net result is that the rules we extract from source systems are often unnormalized, conflicting, incomplete, and sometimes simply wrong.

Moreover, rules usually have little or no recorded provenance. The context, intent, rationale, and approval of a rule are rarely documented, and even the oral history is lost as time passes. The definition of correctness becomes elusive.

Because rules were never treated as first-order requirements from the outset, no "rule development life cycle" was ever defined. In short, the rules do not exist in any official, orderly form; their actual existence must now be inferred transitively by looking for their effects on the data.
Therefore, even if we could extract rules automatically, they would still need a rigorous process of analysis, normalization, and refactoring, followed by testing and, finally, formal (re-)approval. In other words, we need to backfill the "rule development life cycle" to promote the rules to the level required for use as "master rules", and automatic extraction cannot do that, because the required rule metadata simply does not exist.
In a short and striking Sapiens article, the author outlines the failures of many companies, failures ultimately attributed to "ignorance of data, fragmented systems and the inability to monitor their own processes". From the same Sapiens article: "Moody's recently examined the causes of insurance company collapses; one of the main causes is that some insurers do not understand their own pricing models. To avoid 'selling large volumes of policies at inappropriately low prices', Moody's said, insurers will need systems and controls in place 'to quickly identify and address situations where pricing algorithms underestimate risk'."
All of this can be mitigated by implementing "master rule management".

Our conclusion: whether driven by regulatory pressure, by mergers and acquisitions, or by the need to improve business governance and reduce risk, implementing "master rule management" will sooner or later become a priority. By then, it will not be optional.
How do we extract business rules?
The rules are not just important artifacts in themselves; they are the only basis on which you can claim to understand the data, and its downstream processing, correctly. Understanding the data correctly is a precondition for operating the system correctly, and thereby for meeting legal and professional obligations and avoiding the pitfalls outlined above.
To build defensible master rules, the "master rules" must be able to mirror the implemented system rules for every entity in the system (banq's note: this is why DDD entities should be rich objects, with behavioral rules encapsulated in the objects' methods), and for every value of every entity: past, present, and future. Whether auditing, remediating, or migrating, the master rules must be verified independently. These external "master rules" are run in parallel with the system's own rules against a real-time mirror of the data, and the data is then tested incrementally in the stream. In other words, we create two values for each rule-derived property and then compute the difference between them (if any).
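The parallel-run check described above might be sketched as follows (all names and the sample interest rule are assumptions for illustration, not the authors' actual implementation):

```python
# Parallel-run reconciliation sketch: every rule-derived property is
# computed twice, once by the system under audit (already stored in the
# record) and once by the independent "master rule", and the stream of
# differences is what gets tested.

def master_rule_interest(balance, annual_rate, days):
    """Independent re-implementation of the derivation rule under test."""
    return round(balance * annual_rate * days / 365, 2)

def reconcile(records, system_field, rule):
    """Yield (account, system_value, rule_value, delta) for every mismatch."""
    for r in records:
        expected = rule(r["balance"], r["rate"], r["days"])
        actual = r[system_field]
        if abs(actual - expected) > 0.005:  # tolerance for rounding
            yield r["account"], actual, expected, round(actual - expected, 2)

ledger = [
    {"account": "A-1", "balance": 1000.0, "rate": 0.05, "days": 365, "interest": 50.00},
    {"account": "A-2", "balance": 2000.0, "rate": 0.05, "days": 365, "interest": 99.00},
]
mismatches = list(reconcile(ledger, "interest", master_rule_interest))
# one mismatch: account A-2 differs from the master rule by -1.00
```

In a real audit the same comparison runs incrementally over a live mirror of the data rather than over a static list, but the core mechanism, two values per derived property and a computed delta, is the same.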
The article goes on to describe the authors' own approach; click the title to read the original.
Copyright notice

This article was created by [On jdon]. Please include a link to the original when reposting. Thank you.