Information Architecture Experience

For a leading provider of telemedicine, served on the core team that designed the company's next-generation platform. Work included (a) selection of new toolsets for development and operations, (b) creation of an application architecture, (c) creation of a data architecture, (d) creation of a security architecture, (e) creation of a reporting and information architecture, and (f) a gap assessment covering both new functionality and the impact of new technology deployment decisions on current operations. The primary area of focus was data and information architecture. This included a complete review of the current data model and creation of a target conceptual model that was reviewed with the rest of the core team, so that architecture decisions could be not only defined but tested, before the team was ramped up and before design began. The approach was to be forward-thinking and, as much as possible, to test assumptions early, for both technical and business decisions. This proved to be a sound tactic that reduced risk and allowed development to progress along many tracks in parallel.

For a leading provider of telemedicine, created the company's next-generation data model, both logical and physical. About one dozen subject areas were initially identified (e.g., party, member, account, product, provider, clinical); an order was defined and forums were created to dive into each one. Each set of sessions produced (a) a logical data model, (b) a data dictionary, (c) a mapping document for migration from the current-state database, (d) reference data, and (e) physical DDL. The resulting model (a) is built on standard data types, (b) utilizes master codes and party modeling constructs, (c) enforces referential integrity through foreign key relationships, and (d) supports polymorphic relations through object type constructs; it is therefore quite extensible, easy to maintain, and high performing, as it is mostly in third normal form.
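
By way of illustration, a minimal DDL sketch of the master code, party, and foreign key constructs described above, realized here with plain supertype/subtype tables rather than object types; all table and column names are hypothetical, not the actual model:

    -- Illustrative only: a single master code table, a generic party
    -- supertype, and one subtype. Relations that target party_id are
    -- effectively polymorphic across all party subtypes.
    CREATE TABLE master_code (
        code_category VARCHAR(30)  NOT NULL,   -- e.g., GENDER, MEMBER_STATUS
        code_value    VARCHAR(30)  NOT NULL,
        code_decode   VARCHAR(100) NOT NULL,
        PRIMARY KEY (code_category, code_value)
    );

    CREATE TABLE party (
        party_id   INTEGER     NOT NULL PRIMARY KEY,
        party_type VARCHAR(30) NOT NULL        -- PERSON, ORGANIZATION, ...
    );

    CREATE TABLE member (
        party_id        INTEGER     NOT NULL PRIMARY KEY
                        REFERENCES party (party_id),
        status_category VARCHAR(30) NOT NULL,  -- e.g., 'MEMBER_STATUS'
        status_code     VARCHAR(30) NOT NULL,
        FOREIGN KEY (status_category, status_code)
            REFERENCES master_code (code_category, code_value)
    );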

For a leading provider of telemedicine, facilitated creation of a disaster recovery plan. This included (a) creation of conceptual, logical, and physical network and server models, (b) managing the inventory of servers, (c) creation of SOPs and SLAs, (d) creation of resumption and restoration plans, (e) improving monitoring and alert filtration, and (f) implementing internal processes for disaster and security audits. This is a young company that has grown to the point where its operations need to mature in order to continue to scale. This process will help ready the company to pass SSAE 16 audits when required.

For a leading provider of health management services, assumed management of the company's second-generation ODS database. This included the following teams: (a) Data Team: logical modeling, physical DDL, DML script generation, and reference table management; (b) PL/SQL Team: creation, maintenance, and performance tuning of all stored procedures; (c) ETL Team: creation, maintenance, and performance tuning of all batch jobs; and (d) SOA Team: creation, maintenance, and performance tuning of all web services. The SOA team has since evolved into a full-fledged, Java-based enterprise service bus (ESB) that is used for all data access and multi-tier communications.

For a leading provider of health management services, helped design and manage implementation of the company's first foray into Health Information Exchange (HIE). The company was providing clinical analytics in real time to a Regional Health Information Organization (RHIO) consisting of 18 provider organizations (hospitals and plans). The work included integration of the following services: (a) a personal health record (member portal), (b) a disease management platform (part of which were state-sponsored programs for HIV and maternity), and (c) a gaps-in-care rules engine. It also included creation and support of the following HIE constructs: (a) HL7 v3 CCD (Continuity of Care Documents based on the C32 message structure), (b) XCPD management of the patient EMPI (including PIF, PIX, and PDQ services), (c) HIE consent management, and (d) document exchange.
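
The EMPI work above centers on patient identifier cross-referencing. A minimal sketch of the kind of cross-reference that PIX-style services maintain; the table and column names are hypothetical:

    -- Illustrative only: each source system's local patient ID maps to one
    -- enterprise ID; demographic (PDQ-style) queries resolve against the
    -- enterprise record.
    CREATE TABLE enterprise_patient (
        enterprise_id INTEGER      NOT NULL PRIMARY KEY,
        family_name   VARCHAR(100) NOT NULL,
        given_name    VARCHAR(100) NOT NULL,
        birth_date    DATE         NOT NULL
    );

    CREATE TABLE patient_id_xref (
        assigning_authority VARCHAR(64) NOT NULL,  -- source system OID
        local_patient_id    VARCHAR(64) NOT NULL,
        enterprise_id       INTEGER     NOT NULL
                            REFERENCES enterprise_patient (enterprise_id),
        PRIMARY KEY (assigning_authority, local_patient_id)
    );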

For a leading provider of health management services, helped design and manage an implementation for a large medical network consisting of hospitals and practices. While leveraging products resulting from prior work, the new aspect here was consumption of 837 data from practice management systems. As a result, new constructs, mappings, and data structures had to be developed to support these pre-adjudicated claims. Work included (a) addition of functionality to the company's Care Management system to support provider functionality such as quality measures and a disease registry at the provider level, (b) creation of an organizational structure to define hospitals and practices with specific TINs and OIDs, (c) creation of a user master to define users of the Care Team product, with access scoped to the appropriate practice's patients, (d) enhancement of the company's HIE Adapter to support mapping and processing of 837 claims data from practice management systems, (e) integration with a sister company's HIE engine for XCPD management of the patient EMPI, and (f) creation of support programs for monitoring the quality of 837 submissions, including identification of omissions in data drops (a sketch of such a check follows below). In addition to 837 data, the system consumed HL7 v2 lab data from hospitals.
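
A sketch of the kind of omission check those support programs perform, assuming hypothetical staging tables (expected_submitter, claim_837_file) and an arbitrary reporting period:

    -- Illustrative only: flag practices (by TIN) that should have submitted
    -- an 837 file during the period but are absent from the received drops.
    SELECT p.practice_tin,
           p.practice_name
    FROM   expected_submitter p
    WHERE  NOT EXISTS (
             SELECT 1
             FROM   claim_837_file f
             WHERE  f.practice_tin   = p.practice_tin
               AND  f.received_date >= DATE '2012-01-01'
               AND  f.received_date <  DATE '2012-02-01'
           );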

For a leading provider of health management services, helped design and manage the company's first foray into creation of Accountable Care Organizations (ACOs), specifically with one of the 32 health care networks selected for the CMS Pioneer program. This was a very complex project, program managed across multiple work streams: (a) Pioneer Measures: this work stream worked with the client's CMO (Chief Medical Officer) to define quality measures for inclusion on the provider dashboard; the QMs were not just for Medicare patients but flexible enough to be used for all populations; (b) Clinical Workflows: this work stream worked with the client's Case Managers and was divided into three areas: outpatient care management, inpatient care management, and utilization management; (c) Member Information: this work stream focused on analyzing and loading both payer (health plan) data and clinical (EMR) data; and (d) Tools & Technology: this work stream focused on the more technical topics such as EMR integration, portal integration, provider referrals, provider authorization, and so on. Mr. Hochron was co-lead for Member Information and an active participant in the other three. In addition, Mr. Hochron managed development of back-end systems for data consumption, including user master data, payer plan data, and clinical EMR data. Multiple EMR document types were consumed, including HL7 v2 ADT, lab, and transcription documents and HL7 v3 CCD documents, from both hospitals and practices, across many EMR systems including Cerner, NextGen, Allscripts, Centricity, and Cloverleaf.

For one of the world's largest insurance brokers, as part of the global enterprise architecture team, Mr. Hochron formed the global Information Architecture, which included (a) deployment of a master name and address database for companies (one that managed corporate family-tree structures), (b) a Data Warehousing strategy (that included conformed dimensions), (c) creation of canonical messages for all middleware implementations, (d) a corporate reference table repository for maintaining common reference data in one place, and (e) a Data Stewardship project (with the business) that initially focused on prospect/client definitions and rules.

For one of the world's largest insurance brokers, Mr. Hochron architected a successor to the company's master name and address database. Problems with the prior system included: it was linked only to the billing system, limited to clients, not aligned with the end-to-end process, and not implementable on a global basis. The successor system that Mr. Hochron architected was (a) built on a generic party model (e.g., prospects/clients, vendors, insurance carriers, attorneys, additional named insureds), (b) based on a subscriber system model to properly handle corporate actions via a mapping of keys (see the sketch below), (c) globally deployed (and able to support overlapping geographies for different operating companies), (d) implemented such that all back-end processing (to Dun & Bradstreet) was fully mechanized, (e) able to support multiple peer groups performing data repair and maintenance (across geographies), and (f) able to support both legal (primary) and mailing (secondary) company names/addresses.
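
A minimal sketch of the subscriber-system key mapping described in (b), with hypothetical names. When a corporate action merges two master records, only the key map is re-pointed; subscribing systems keep their local keys:

    -- Illustrative only: a company master (with legal and mailing names and
    -- a family-tree link) plus a cross-reference of subscriber-system keys.
    CREATE TABLE company_master (
        master_id    INTEGER      NOT NULL PRIMARY KEY,
        legal_name   VARCHAR(200) NOT NULL,   -- primary (legal) name
        mailing_name VARCHAR(200),            -- secondary (mailing) name
        parent_id    INTEGER REFERENCES company_master (master_id)
    );

    CREATE TABLE subscriber_key_map (
        subscriber_system VARCHAR(30) NOT NULL,  -- e.g., the billing system
        subscriber_key    VARCHAR(64) NOT NULL,  -- that system's local key
        master_id         INTEGER     NOT NULL
                          REFERENCES company_master (master_id),
        PRIMARY KEY (subscriber_system, subscriber_key)
    );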

For one of the world's largest insurance brokers, Mr. Hochron defined a Data Warehouse strategy that was a blend of Inmon and Kimball; in other words, it consists of a persistent normalized Data Warehouse as well as conformed dimensions for denormalized Data Marts. Part of this process was the definition and mobilization of the teams needed to implement the solution. Most important were the creation of: (a) an integrated team (members from data modeling, ETL, DBA, reporting (project and ad hoc), architecture, data warehouse, and Business Analysts) that focused on the definitions and structure of the conformed dimensions, (b) mobilization of a Data Stewardship program for consistent business definitions (with initial focus on prospect/client), and (c) resolution of the organizational issues around maintaining common codes across applications.
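
A hypothetical sketch of how a single conformed dimension serves multiple marts, which is the essence of the Kimball half of the blend; all names are illustrative:

    -- One conformed client dimension; every mart's facts reference the same
    -- surrogate key, so "client" rolls up identically across subject areas.
    CREATE TABLE dim_client (
        client_key   INTEGER      NOT NULL PRIMARY KEY,  -- surrogate key
        client_id    VARCHAR(64)  NOT NULL,  -- natural key from the warehouse
        client_name  VARCHAR(200) NOT NULL,
        effective_dt DATE         NOT NULL,  -- slowly changing dimension
        expiry_dt    DATE
    );

    CREATE TABLE fact_revenue (
        client_key  INTEGER        NOT NULL REFERENCES dim_client (client_key),
        revenue_amt DECIMAL(15, 2) NOT NULL
    );

    CREATE TABLE fact_retention (
        client_key     INTEGER NOT NULL REFERENCES dim_client (client_key),
        policies_count INTEGER NOT NULL
    );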

For one of the world's largest insurance brokers, Mr. Hochron defined and launched a process for maintaining Canonical Messages for the company's implementation of IBM's WBI (formerly known as MQSeries). Being in the insurance industry, the canonical messages were all ACORD-based (ACORD is an independent standards organization that publishes XML definitions for standard messages). The technical solution was based on three master XML schemas that formed the foundation of every canonical message. A software layer was put above this to extract canonical messages for specific domains (e.g., party, policy, reference data, insurance carrier, invoice). A change management process was also defined such that requests from one system would not force cascading changes on other messaging constituents.

For one of the world's largest insurance brokers, Mr. Hochron architected a Corporate Reference Data Repository (CRDR) as the global master for common reference codes. These reference codes are at the heart of the information architecture defined above. CRDR handles four types of reference data: (a) hierarchies, (b) groups, (c) master codes (a single table, segmented by category, that houses standard decodes such as country codes, currency codes, and status codes; see the sketch below), and (d) special tables such as an office master that assigns a unique ID to each facility/office. CRDR data is maintained by Data Stewards and Data Custodians. Once the data is ready for release, CRDR publishes it as a WBI message; subscribing systems listen on the proper CRDR topic for receipt and processing.
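
A minimal sketch of the master codes construct in (c); the table and column names are hypothetical:

    -- Illustrative only: one table, segmented by category, holding standard
    -- decodes. Subscribing systems refresh a local replica from the
    -- published WBI message and decode against it.
    CREATE TABLE crdr_master_code (
        category VARCHAR(30)  NOT NULL,   -- COUNTRY, CURRENCY, STATUS, ...
        code     VARCHAR(30)  NOT NULL,
        decode   VARCHAR(100) NOT NULL,
        PRIMARY KEY (category, code)
    );

    INSERT INTO crdr_master_code VALUES ('COUNTRY',  'US',  'United States');
    INSERT INTO crdr_master_code VALUES ('CURRENCY', 'USD', 'US Dollar');

    SELECT decode
    FROM   crdr_master_code
    WHERE  category = 'CURRENCY' AND code = 'USD';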

For a global money center retail and investment bank, as part of an enterprise architecture team, Mr. Hochron developed a detailed information architecture for the management of technology assets. The purpose of the Asset Management program was to (a) track asset inventory detail in order to support sister processes such as problem management, change management, configuration management, systems management, and IMAC, (b) track financials, including purchase cost, maintenance cost, and license information, and (c) provide the proper detail for chargeback of those assets to the lines of business (see the sketch below). Specific work included (a) collection of transactional and reporting requirements, (b) a current state assessment, (c) development of a conceptual data model, (d) a logical data model, and (e) a physical data model (for the server inventory scan program), and (f) creation of canonical mappings. Work also included an advisory role in (a) package selection, (b) creation of business rules, and (c) process creation. One of the bigger challenges of this project was the alignment of reference tables throughout the bank.
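
A hypothetical sketch of the chargeback detail described above; the names and structure are illustrative, not the bank's actual model:

    -- Illustrative only: an asset record carrying financial detail, plus a
    -- line-of-business allocation used to compute chargeback.
    CREATE TABLE asset (
        asset_id      INTEGER        NOT NULL PRIMARY KEY,
        asset_type    VARCHAR(30)    NOT NULL,  -- SERVER, DESKTOP, LICENSE...
        purchase_cost DECIMAL(12, 2) NOT NULL,
        maint_cost_yr DECIMAL(12, 2) NOT NULL
    );

    CREATE TABLE asset_chargeback (
        asset_id  INTEGER       NOT NULL REFERENCES asset (asset_id),
        lob_code  VARCHAR(30)   NOT NULL,  -- line of business
        alloc_pct DECIMAL(5, 2) NOT NULL,  -- share of cost charged back
        PRIMARY KEY (asset_id, lob_code)
    );

    -- Chargeback per line of business.
    SELECT c.lob_code,
           SUM((a.purchase_cost + a.maint_cost_yr) * c.alloc_pct / 100)
               AS charge_amt
    FROM   asset a
    JOIN   asset_chargeback c ON c.asset_id = a.asset_id
    GROUP  BY c.lob_code;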

For a global technology manufacturing and services company, Mr. Hochron was part of a small management consulting team that defined a methodology for the valuation of Data Marts. The impetus for this project was to understand the revenue contributions attributable to Marketing Data Marts. The result was a methodology comprising three components: (a) valuation metrics, (b) a valuation model, and (c) valuation processes. The methodology defined (a) what was being measured, (b) why it was being measured, and (c) a framework for continuous improvement, so as to improve subsequent scores.

For a global telecommunications company, Mr. Hochron was the chief architect on a data-cleansing project to develop a common customer key. The project entailed creating a customer hierarchy on top of the existing account hierarchy. The purpose of the "clean" customer hierarchy was to enable Marketing to execute cross-sell and up-sell campaigns. Prior to this effort, account information was product-specific and the company did not have good visibility into which products a particular customer was purchasing. Together with outside information on share of wallet, the company became able to target one-to-one campaigns that have the potential to dramatically increase revenues.
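
A minimal sketch of a common customer key layered over product-specific accounts, with hypothetical names:

    -- Illustrative only: every product-specific account carries the common
    -- customer key, so Marketing can see all products a customer holds.
    CREATE TABLE customer (
        customer_key  INTEGER      NOT NULL PRIMARY KEY,
        customer_name VARCHAR(200) NOT NULL,
        parent_key    INTEGER REFERENCES customer (customer_key)  -- hierarchy
    );

    CREATE TABLE account (
        account_id   VARCHAR(64) NOT NULL PRIMARY KEY,
        product_code VARCHAR(30) NOT NULL,  -- product-specific source system
        customer_key INTEGER     NOT NULL REFERENCES customer (customer_key)
    );

    -- Cross-sell view: products held per customer.
    SELECT c.customer_name, a.product_code
    FROM   customer c
    JOIN   account  a ON a.customer_key = c.customer_key
    ORDER  BY c.customer_name, a.product_code;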

For a global telecommunications company, Mr. Hochron was a core technical advisor (Enterprise Architect) to a multi-million dollar global effort for Common Data Management. This program had five work streams: Organizational Hierarchy Alignment, Core Business Terminology, Subject Area Inventory, Data Discovery & Cleansing, and Data Management. The program had two main deliverables: (1) a common business language (CBL), expressed in business terms and validated through an enterprise data model (the purpose of the CBL is to ensure common business definitions globally and alignment of those definitions to IT transactional systems and the Data Warehouse/Data Marts), and (2) clean data (whereby the CBL helps define the business rules necessary for high data quality). In addition to being core technical advisor, Mr. Hochron also managed the technical implementation team, with overall responsibility for quality.

For a global stock exchange, Mr. Hochron led the creation of a Data Warehousing Strategy. The purpose of this strategy was to define (a) the Data Warehousing Framework (characteristics, architecture, principles, transformation, and end-user tools), (b) Support Processes (governance, data quality, and development methodology), (c) Organizational Impact (roles and responsibilities, and functional entities), and (d) a Conceptual Data Model (in terms of subject areas).

For a global investment bank, Mr. Hochron led the creation of an architecture plan centered on the introduction of a new CRM system. The purpose of the project was to understand all application and data dependencies associated with the introduction of the CRM package. The project entailed defining target-state application and information architectures and mapping the gaps to the current state. Select applications and data stores were targeted for sunsetting (decommissioning). A large part of the project was to define the information flow centered on the party subject area (which included customer and employee). In all, six subject areas (those directly affected by CRM) were defined in detail. All aspects of the target state were aligned to the client's business strategies and objectives.