Daniel Kocis

Profile Information

Job Function:
Business Analytics, Predictive Modeling, Data Mining, Biostatistics, Marketing Databases, Operations Research, Quant, Visualization, Statistical Consulting, Medical Statistics, Artificial Intelligence, Finance, SAS
Short Bio:
PROFESSIONAL SUMMARY - Developed data-sourcing and reporting processes that migrated LOB risk reporting into a single group overseeing auditing and regulatory data-quality governance. Provided key components of risk-control production for this major money center bank, across several SAS 9.2 metadata information portals connected to Teradata, DB2, Oracle, SASPlex, and SBIL enterprise repositories.

RECENT EXPERIENCE- 12/2009 – 4/2011 – A Major Money Center Bank

• Supporting Regulatory Enterprise Credit Risk Reporting –
Audited and optimized all legacy SQL table-creation code by developing a CASE-condition database with OCC-compliant metrics across all consumer credit products (Card, UNS, Auto, Small Business, International, MTG, and HE). Created an EG 4.2 reporting system of specific CASE conditions and aliases that helped establish a “common coded definition” across the enterprise. These defined key metrics (outstanding balance, active/open accounts, APR outstanding balances, and OCC indicators) were collected monthly across all consumer credit product origination and portfolio tables. Produced optimized rewrites of production SQL, reducing runtime and spool-space utilization by up to 80%.
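As a minimal sketch of the “common coded definition” idea (in Python rather than the SAS/SQL used in production, and with invented product codes, thresholds, and field names), a single shared CASE-style rule can replace per-LOB copies of the logic:

```python
# Hypothetical sketch: one shared CASE-style definition applied to every
# consumer-credit product table, so each LOB reports the same status metric.
# Thresholds and statuses are made up for illustration.

def account_status(balance, days_delinquent, closed_flag):
    """A single 'common coded definition' instead of per-LOB SQL copies."""
    if closed_flag:
        return "CLOSED"
    if days_delinquent >= 90:
        return "DELINQUENT_90PLUS"
    if balance > 0:
        return "ACTIVE"
    return "OPEN_NO_BALANCE"

# Invented sample rows standing in for product origination/portfolio tables.
accounts = [
    {"product": "Card", "balance": 1200.0,  "days_delinquent": 0,   "closed_flag": False},
    {"product": "Auto", "balance": 0.0,     "days_delinquent": 0,   "closed_flag": False},
    {"product": "MTG",  "balance": 95000.0, "days_delinquent": 120, "closed_flag": False},
]

for a in accounts:
    a["status"] = account_status(a["balance"], a["days_delinquent"], a["closed_flag"])
```

Because every product table flows through the same function, the monthly roll-ups agree by construction rather than by per-team convention.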

Data-sourced all tables and produced dashboards for the BOD covering total risk exposure, credit utilization, and enterprise-wide geographic concentrations. Aggregating across all LOBs with multiple SLAs, the current monthly total outstanding exposures used the standardized CASE conditions listed above. Created several production run libraries that migrated all processes into EG 4.2 projects.

Used EM6.2 “Rule-Based Technique” to investigate drivers of impaired mortgage accounts.

• Supporting portfolio credit performance trend tracking by LOB
Used credit bureau samples to define peer and total-market segments of dealer-based auto and specialty brokered loans. These were profiled by geographic concentration risk, calculating share of business with current estimated losses and volatility-adjusted losses reported at the origination and portfolio levels.

• Supporting Bank Risk Policies monitoring
Produced delinquency-rate wedges for all enterprise credit risk policy and geographic concentration limits by tracking actual performance against portfolio target levels within domestic and international markets. Rolled historical MTD/QTD/YTD views by account open date, with total outstanding, available, and delinquent balances, delivered on a monthly SLA to all LOB management.
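A hedged sketch of the MTD/QTD/YTD roll-up (in Python for illustration; the sample records, dates, and balances are invented, and production ran in SAS against the bank's tables):

```python
from datetime import date

# Invented sample: (posting date, delinquent balance) pairs.
records = [
    (date(2011, 1, 15), 500.0),
    (date(2011, 2, 10), 300.0),
    (date(2011, 3, 5),  200.0),
    (date(2011, 3, 20), 100.0),
]

def period_to_date(records, as_of):
    """Sum balances month-to-date, quarter-to-date, and year-to-date."""
    qtr_start_month = 3 * ((as_of.month - 1) // 3) + 1
    mtd = sum(b for d, b in records
              if d.year == as_of.year and d.month == as_of.month and d <= as_of)
    qtd = sum(b for d, b in records
              if d.year == as_of.year and d.month >= qtr_start_month and d <= as_of)
    ytd = sum(b for d, b in records if d.year == as_of.year and d <= as_of)
    return mtd, qtd, ytd

mtd, qtd, ytd = period_to_date(records, date(2011, 3, 31))
```

The same accumulation, keyed additionally by account open date and LOB, yields the wedge series tracked against the portfolio targets.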

• Supporting Bank Loan Quality and New Acquisitions.
Produced monthly source analysis of new credit card acquisitions by geography and FICO band. Built a current-vs-previous-month source dashboard providing target geography and credit-risk indicators measured against a set of credit policy levels.
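A minimal sketch of the FICO-band bucketing behind such an analysis (in Python for illustration; the band edges here are hypothetical, as the actual bands would come from the bank's credit policy):

```python
import bisect

# Hypothetical band edges; real cut-points come from credit policy.
BAND_EDGES = [580, 620, 660, 700, 740]
BAND_NAMES = ["<580", "580-619", "620-659", "660-699", "700-739", "740+"]

def fico_band(score):
    """Map a score to its band; edges are treated as band lower bounds."""
    return BAND_NAMES[bisect.bisect_right(BAND_EDGES, score)]

# Invented sample of new-acquisition scores for the month.
new_accounts = [612, 745, 580, 699, 530]
counts = {}
for s in new_accounts:
    band = fico_band(s)
    counts[band] = counts.get(band, 0) + 1
```

Crossing these counts with geography gives the current-vs-previous-month source view.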
LinkedIn Profile:
Networking, New Venture
The art and science of mathematical investigation begins with asking, “What kind of data do I have?”, and then, “How do I re-represent these raw numbers into something more meaningful?” A datum refers to a single metric, usually a number, while data are a collection of numbers and alphanumerics. Before selecting a tool and building models, the investigator needs to create an extended database from which to calculate their models. Preliminary questions include:

·Is the data static (such as a one-time survey) or dynamic (periodic refreshes)?

·Do the data exist in a single table or spread across several tables?

·If they do exist across several tables, what is the hierarchy of each table?
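To make the table-hierarchy question concrete, here is a minimal sketch in Python (the table names, keys, and values are hypothetical) of rolling a child table up to its parent to build the extended database:

```python
# Hypothetical two-level hierarchy: customers is the parent table,
# transactions the child, linked by cust_id.
customers = {101: {"name": "A"}, 102: {"name": "B"}}
transactions = [
    {"cust_id": 101, "amount": 25.0},
    {"cust_id": 101, "amount": 75.0},
    {"cust_id": 102, "amount": 40.0},
]

# Build the "extended database": one row per customer with aggregates
# derived from the child table.
extended = {}
for cid, cust in customers.items():
    amts = [t["amount"] for t in transactions if t["cust_id"] == cid]
    extended[cid] = {**cust, "n_txn": len(amts), "total": sum(amts)}
```

Knowing which table is the parent decides the grain of the modeling dataset; aggregating the wrong way (parent into child) would duplicate rows instead.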

This chapter addresses several issues and provides many different examples of creating and transforming raw variables into potential quantitative indicators and metrics.

This chapter will show how to:

·Use streaming data for nuclear and physiological investigations,
·Create unique time-dependent hierarchies from flat data,
·Transform simple counting statistics to increase the number of new variables,
·Constrict confidence intervals with additional transformed variables,
·Create complex physiological ratios to predict heart failure in children,
·Use demographic/geographic/transaction data that define high-valued clients,
·Calculate financial filters to identify n-period-ahead closing stock prices,
·Create “Discovery Paths” within a website that reconfigure with SEO,
·Capture meaningful metrics from NCSA raw web logs,
·Use “relativistic coding” in time series analysis,
·Aggregate the dynamic quality of streaming data to create new variables,
·Quantify text mining from organic keyword searches on the web,
·Define “high-valued” or “target” segments,
·Re-aggregate the hierarchy of data into meaningful segments,
·Create dimensions for CRM based upon simple sign-in data,
·Look at data and their transformations in new and different ways.

© 2021 TechTarget, Inc.
