
Daryl Devdutt / Fake Resume

United States

FAKE RESUME...Shame on him

Daryl Devdutt
Sr. IBM DataStage Consultant

SUMMARY:
• Seven Plus (7+) Years of experience in the field of Information Technology with solid experience in Software Analysis, Design and Development, Internet Technologies, Client/Server business systems, Data warehousing and application development
• Five Plus (5+) Years of extensive experience with Data Warehousing and Business Intelligence applications using IBM Information Server 8.0.1, Ascential DataStage 7.5/7.1/6.0/5.2/XE (Administrator, Manager, Designer, Director, MetaStage, Parallel Extender/Orchestrate, Quality Stage (Integrity), MetaRecon, XE/390), ETL, Trillium, First Logic, Data mart, OLAP and OLTP
• Five Plus (5+) Years of extensive experience in Data Modeling using Dimensional Data Modeling, Star Join Schema Modeling, Snow-Flake Modeling, Fact and Dimensions Tables, Physical and Logical Data Modeling and ERwin 4.5/4.0/3.5/3.0, Enterprise Database Integration and Management among Sybase, Oracle, MS Access, Teradata, MS SQL Server and DB2, Database Design (Physical and Logical), Data Administration and Maintenance, Application Support, Performance Tuning, Optimization
• Five Plus (5+) Years of extensive Business Intelligence Experience in Business Objects XI/6.5/6.0/5.1/4.x, BO Developer Suite (Supervisor, Designer, Web Intelligence (Infoview), Broadcast Agent), Cognos Impromptu 7.0/6.x (Impromptu, PowerPlay, Transformer, Impromptu Web Reports (IWR), Upfront, PowerPlay Enterprise Server)
• Seven Plus (7+) Years of experience using Oracle 10g/9i/8i/8.x/7.x, DB2 9.1/8.0/7.0, Teradata V2R6/V2R5/V2R4/V2R3, MS SQL Server 2000/7.0/6.5, Sybase 12.x/11.x, MS Access 7.0/2000, SQL, VB 6.0/5.0, ASP, XML, PL/SQL, SQL*Plus, SQL*Loader and Developer 2000, Win 3.x/95/98/2000/XP, Win NT 4.0 and Sun Solaris 2.X

CERTIFICATIONS:
• IBM Certified Solution Developer – WebSphere IIS DataStage Enterprise Edition V7.5

TECHNICAL SKILLS:

ETL Tools: IBM Information Server 8.0.1, Ascential DataStage 7.5/7.0/6.0/5.2, MetaRecon, Integrity, Data Mining, OLAP, OLTP
Data Modeling Tools: Dimensional Data Modeling, Star Join Schema Modeling, Snow-Flake Modeling, FACT and Dimensions Tables, Conceptual, Logical and Physical Data Modeling, ERwin 4.5/4.0/3.5.2/3.x
Data Cleansing: First Logic, Trillium
Reporting Tools: Data Reports, Seagate Crystal Reports 6.x/5.x/4.x/3.x, Cognos, SQR 3.x, Developer 2000 (Forms 4.5/5.0, Reports 2.5/3.0), Business Objects, MS Access Reports
Databases: Oracle 10g/9i/8i/8.0/7.3, UDB DB2, Sybase 12.0/11.0, MS SQL Server 6.5/7.0, Teradata V2R6/V2R5/V2R4, ISQL, TSQL, MS Access 7.0/97/2000
Servers: Oracle 9iAS, MTS, IIS, Apache, Tomcat, MS Site Server
Languages: C/C++, C#, VB Script, JavaScript, HTML, UML, SQL, T-SQL, PL/SQL, SQL*Plus, SQL*Loader
Operating Systems: MVS OS, IBM 390/370, Windows NT 4.0/2000/XP, Sun Solaris, Sun OS, HP-UX, SCO Unix, Win 95/98/3.1, MS DOS 6.22, Sun Sparc, HP-9000, IBM AIX 5.3

PROFESSIONAL EXPERIENCE:

American Express, Phoenix, AZ Mar 2008 - Current
Sr. DataStage Developer

American Express is a leading global payments network and travel company. The goal of this project is to develop a Data Warehouse, and then Data Marts, that will serve management's need for financial information and aid in decision making.

Responsibilities:
• Actively involved in all phases of the project that included Requirement Analysis, Client Interaction, Design, Coding, Testing and Support
• Actively participated in team meetings and mapping sessions to understand the business requirements
• Extensively used the CFF stage to read and write mainframe files, extract multiple record type files and split them by record type for further processing
• Used DB2 Enterprise stage to load data into the DB2 tables.
• Worked with other team members to design jobs that can be used across the project and result in minimum maintenance
• Used the state file in the Transformer stage to keep track of the surrogate keys.
• Developed Parameter sets that would reduce the time needed to add project level parameters to various jobs and job sequences
• Used the Change Data Capture stage to perform delta processing.
• Minimized the use of transformer stages and used Modify stage where possible while building jobs to improve performance
• Used stage variables to keep track of data in the source that came in various sets of batches.
• Created Recovery and Backup process to store the Surrogate key-Natural key combination datasets in flat files and recover them if needed.
• Used before job subroutines to strip the header records in the mainframe files
• Developed DataStage jobs using various stages to Extract, Transform and Load data into the DB2 tables
• Developed Multiple Instance jobs so as to reuse the same job multiple times in the job sequence
• Reduced the process run time by performance tuning DataStage jobs, optimizing system resources and incorporating parallelism
• Developed DataStage jobs that can handle huge volumes of data.
• Used DataStage Director to monitor performance statistics of jobs and unlock jobs when required. Also used Information Server Web Console to unlock jobs.
• Used DataStage Designer client for importing source and target table definitions, importing and exporting jobs and creating new job categories
• Imported Mainframe Cobol copybooks using DataStage Designer Client
• Used DataStage Administrator to set environment level parameters
• Migrated jobs from Development Server to the QA Server
• Developed Job Sequences to aid in automation and job control
• Developed unit test cases and performed unit testing of all DataStage jobs and job sequences
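The delta processing mentioned above relies on the Change Data Capture stage, which compares a "before" and "after" dataset by key and classifies each record as an insert, update or delete. A minimal Python sketch of that same classification logic (illustrative data, not DataStage code):

```python
def delta(before: dict, after: dict):
    """Classify keyed records into inserts, updates and deletes,
    mimicking what the Change Data Capture stage produces."""
    inserts = {k: v for k, v in after.items() if k not in before}
    deletes = {k: v for k, v in before.items() if k not in after}
    updates = {k: v for k, v in after.items()
               if k in before and before[k] != v}
    return inserts, updates, deletes

# Hypothetical key -> record snapshots from two consecutive runs
before = {1: "Acme", 2: "Globex", 3: "Initech"}
after  = {1: "Acme", 2: "Globex Corp", 4: "Umbrella"}
ins, upd, dele = delta(before, after)
```

In the real jobs the downstream links then apply only the changed rows to the DB2 targets, which is what makes delta loads cheaper than full reloads.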

Environment: IBM Information Server 8.0.1 (WebSphere DataStage and QualityStage), IBM AIX 5.3, IBM DB2 9.1, TOAD for DB2, DB2 Control Center, MQC test director, Tumbleweed, Control-M, Micro Strategy 8, Windows XP, MS Visio

Ashland Inc., Dublin, OH Oct 2007 – Feb 2008
Sr. DataStage Developer

Ashland Inc. is a FORTUNE 500, diversified chemical company that provides innovative products, services and solutions to customers around the world. It operates in the United States and in more than 100 countries across the world. The objective of this project was to prepare data to load as a part of the ongoing process of getting all their acquisitions into SAP.

Responsibilities:
• Actively involved in all phases of the project that included Requirement Analysis, Client Interaction, Design, Coding, Testing and Support
• Actively participated in team and business meetings to understand the business requirements
• Worked closely with the team members and business team members to ensure that the project is running as per schedule
• Worked with DataStage NLS settings in order to be able to read and write Chinese data in DataStage
• Worked on Code page settings and NLS settings to ensure that the Chinese data extracted and loaded into Oracle database is accurate
• Modified DataStage jobs to extract and load Chinese data into SAP
• Parameterized NLS Settings for the existing jobs to ensure that the settings can be changed at run time, if needed
• Developed DataStage jobs using various stages to Extract, Transform and Load data into the Oracle database
• Used the SAP R/3 plug-in for DataStage to extract data from SAP
• Used DataStage Director to monitor performance statistics of jobs and unlock jobs when required
• Used DataStage Manager for importing source and target database schemas, importing and exporting jobs and creating new job categories
• Used DataStage Administrator to set environment level parameters
• Developed Job Sequences to aid in automation and job control
• Developed unit test cases and performed unit testing of all DataStage jobs and job sequences
• Provided production support to ensure smooth running of a workshop which was attended by people from all over the world
• Worked on existing DataStage jobs to improve performance
• Reduced the run time for some processes by performance tuning DataStage jobs, optimizing system resources and incorporating parallelism
• Achieved over 60% reduction in run time of certain DataStage jobs
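The NLS and code page work above comes down to decoding source bytes with a code page chosen at run time rather than hard-coded into the job. A hedged Python sketch of that idea, using GB18030 as an illustrative stand-in for the project's actual NLS settings:

```python
def read_records(raw: bytes, codec: str = "gb18030"):
    """Decode source bytes with a run-time codec parameter,
    analogous to parameterizing DataStage NLS settings."""
    return raw.decode(codec).splitlines()

# Hypothetical Chinese master-data rows, encoded as the source system would send them
raw = "客户,北京\n供应商,上海".encode("gb18030")
rows = read_records(raw, codec="gb18030")
```

Passing the codec as a parameter mirrors the resume's point: the same job can process data in a different code page without being recompiled or redesigned.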

Environment: IBM WebSphere DataStage 7.5.1, Parallel Extender, Quality Stage, IBM AIX 5.3, Oracle DTA, SAP R/3, Oracle 9i, TOAD for Oracle, Visual SourceSafe, Windows XP, MS Visio.

The Travelers Companies, Inc., Hartford, CT Feb 2007 – Sept 2007
Sr. DataStage Developer

The Travelers Companies is the second largest underwriter of Commercial Property, Casualty and Personal Insurance in the United States. The main objective of this project was to develop a Data Mart to support the Regional Presidents' need for management information.

Responsibilities:
• Actively involved in all phases of the project that included Requirement Analysis, Client Interaction, Design, Coding, Testing, Support and Documentation
• Actively participated in team meetings and business meetings to understand the business requirements
• Worked closely with the Business Analyst and the data modeler during the development phase to get a clear understanding of the business rules and data mapping
• Designed and Developed DataStage jobs to Extract, Transform and Load data into the Teradata database
• Developed parallel jobs using various stages like Transformer, Sort, Remove Duplicate, Funnel, Data set, Peek, Sequential file, Teradata Enterprise Stages
• Prepared the Technical Design Documents and Design Strategies for each of the source systems
• Used stage variables for source validation, to capture rejects and to indicate the reason for the record being rejected
• Designed and developed UNIX Korn Shell scripts for integration with ETL controls and to automate data load processes to the target Data Warehouse
• Used DataStage Director to monitor jobs and view logs to debug jobs
• Used DataStage Manager for importing source and target database schemas, importing and exporting jobs and creating new job categories
• Used DataStage Administrator to set environment level parameters
• Worked with Teradata utilities like Fast Load, Multi Load and developed BTEQ scripts
• Developed unit test cases and performed unit testing of all DataStage jobs, UNIX scripts and BTEQ scripts
• Extensively used the MQC test director to log all the unit test cases
• Assisted in preparing the Capacity Plan for the project to ensure that the project does not run into resource issues
• Assisted in performing cause-and-effect analysis for critical issues to help resolve them quickly and keep the project on track for its deliverables
• Used Maestro to schedule jobs
• Assisted in preparing Maestro schedules and documentation
• Involved in Teradata V2R6 database administration (DBA) activities on the Development server.
• Extensively used the MKS Version Control tool
• Involved in the preparation of the Deployment Guide that assists in ensuring that all the required groundwork is done when moving from a lower environment to higher environments
• Participated in the review of Technical Design Document and Design Strategies
• Prepared the run document to assist the Production Support team; it covers most possible cases of process failure and the steps needed to return the process to a running state
• Performed Knowledge Transfer to the Production Support team
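The stage-variable validation described above tags each failing source record with the reason it was rejected, so rejects can be investigated later. A small Python sketch of that pattern (the field names and rules are hypothetical):

```python
def validate(rec):
    """Return a reject reason, or None if the record is clean --
    the role the stage variables played in source validation."""
    if not rec.get("policy_id"):
        return "MISSING_POLICY_ID"
    if rec.get("premium", 0) < 0:
        return "NEGATIVE_PREMIUM"
    return None

records = [
    {"policy_id": "P1", "premium": 100},
    {"policy_id": "",   "premium": 50},
    {"policy_id": "P3", "premium": -5},
]
# Route clean rows onward; keep rejects paired with their reason
clean   = [r for r in records if validate(r) is None]
rejects = [(r, validate(r)) for r in records if validate(r) is not None]
```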

Environment: IBM WebSphere DataStage 7.5.1, Parallel Extender, IBM AIX 5.3, Cognos 8.1/8.2, Teradata V2R6, MKS Integrity Client, MQC test director, Tivoli Maestro, Windows XP, MS Visio

Allstate, IL Jan 2005 – Jan 2007
DataStage Developer

Allstate is one of the largest insurance companies in the US. The primary objective of the project was to develop a Data Warehouse system making extensive use of Data Marts, extracting data stored in different databases and loading it into an Oracle system.

Responsibilities:
• Participated in all phases including Requirement Analysis, Client Interaction, Design, Coding, Testing, Support and Documentation
• Gathered and documented requirements, analyzed them and converted them into High Level Design Documents
• Conducted ETL team meetings and explained the requirements to team members
• Walked through job designs and code with team members, correcting where necessary
• Designed job templates to specify the high-level framework approach
• Designed the Data Mart model with ERwin using Star Schema methodology
• Developed Logical and Physical data models that capture current-state/future-state data elements and data flows using ERwin
• Designed and Customized data models for Data warehouse supporting data from multiple sources on real time
• Designed and developed DataStage jobs for loading staging data from different sources, such as PeopleSoft and SQL Server databases, into the Data Warehouse, applying business rules that cover data loads, data cleansing and data massaging
• Implemented Slowly Changing Dimension Type II for all tables in the DW
• Generated Sequencer jobs, called through AutoSys, to run the SCD loads in the correct order
• Designed and developed DataStage jobs to extract data from Data Warehouse tables and load the Dimension and Fact tables of different Data Marts
• Extensively used the Pivot stage to pivot and unpivot source data to achieve the required table structures, converting data from columns into rows and rows into columns
• Extensively used all the stages in DataStage like Pivot, Hashed File, Folder, Container, Aggregator and Transformer
• Designed, developed and deployed DataStage jobs and fine-tuned them by implementing best practices such as using Hashed Files for look-ups, following naming standards and parameterizing variables from environment level down to stage level
• Used Parallel Extender to run jobs on SMP systems
• Extensively used the Aggregator stage to sort and group the provider data and calculated the quartile ranking for the eligible providers
• Used QualityStage's built-in wizards for parsing and removing duplicate records
• Configured the MetaStage Directory using MetaStage Administrator; extensively used almost all of the DataStage stages and transforms for various types of date conversions
• Developed Shell scripts to automate file manipulation and data loading procedures
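The Type II slowly changing dimension load mentioned above expires the current dimension row and appends a new version whenever tracked attributes change, preserving history. An illustrative Python sketch of that core step (simplified schema, not the actual DataStage job):

```python
from datetime import date

def scd2_apply(dim_rows, key, incoming, today):
    """Expire the current row for `key` and append a new version --
    the core move of a Type II slowly changing dimension load."""
    for row in dim_rows:
        if row["key"] == key and row["end_date"] is None:
            if row["attrs"] == incoming:
                return dim_rows           # no change, nothing to do
            row["end_date"] = today       # expire the old version
    dim_rows.append({"key": key, "attrs": incoming,
                     "start_date": today, "end_date": None})
    return dim_rows

# Hypothetical customer dimension: the customer moves cities
dim = [{"key": "C1", "attrs": {"city": "Chicago"},
        "start_date": date(2004, 1, 1), "end_date": None}]
scd2_apply(dim, "C1", {"city": "Dallas"}, date(2005, 6, 1))
```

After the load, both versions exist: the Chicago row closed out on the change date, and an open-ended Dallas row, which is exactly what lets fact rows join to the attributes that were current at transaction time.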

Environment: IBM WebSphere (Formerly Ascential) DataStage 7.5.1.A, DataStage Plug-in, Parallel Extender, Quality Stage, MetaStage, Erwin 4.5, Version Control, PL/SQL, Oracle 10g/9i, SQL Server 2000, DB2, PeopleSoft 7.5, Business Objects 6.5, Windows XP/2000, HP UNIX

HSBC-HOUSEHOLD, Chicago, IL Apr 2003 - Dec 2004
DataStage Developer

Responsibilities:
• Requirement Analysis, Client Interaction and User Interviews
• Extensively used ERwin for Logical/Physical data modeling and Dimensional Data Modeling, and designed Star Schemas
• Created Entity Relationship (ER) diagrams and maintained corresponding documentation for the corporate data dictionary with all attributes, table names and constraints
• Designed and developed DataStage jobs for data loads and data cleansing
• Extensively used almost all the stages of DataStage (Hashed File, Universe, Unidata, Transformer etc.)
• Designed, developed and deployed DataStage jobs and fine–tuned them
• Extensively used Ascential DataStage Designer to perform complex mappings based on user specifications
• Created re-usable repository objects using the Ascential DataStage Manager
• Extensively used Ascential DataStage Director and Shell scripting for scheduling the job to run, emailing production support for troubleshooting from log files
• Used DataStage Manager to import Metadata Table Definition.
• Developed jobs in Parallel Extender using different stages like Transformer, Join, Aggregator, Lookup, Source dataset, External Filter, Row Generator, Column Generator, FTP, Sorting, Expand, Funnel, Remove Duplicates, and Merge
• Performed one to one, one to many, many to one and outer join mappings
• Created DataStage jobs, batches and job sequences and tuned them for better performance
• Developed Interfaces using UNIX Shell Scripts to automate the bulk load, update and Batch processes
• Design and development of UNIX Shell Scripts
• Worked on data conversions and data loads using PL/SQL and SQL*Loader
• Extensively worked on Ascential Quality Stage to clean up the data
• Used Parallel Extender for parallel processing of data extraction and transformation
• Utilized the MetaStage Directory to source, share, store and reconcile a comprehensive spectrum of metadata
• Used Transformer stage for transformations and Aggregator for aggregations
• Used Hashed File stages as reference tables based on a single key
• Used ODBC stage to extract data from data sources
• Defined source-to-target field mappings in DataStage and completed source system profiling
• Extracted huge volumes (terabytes) of data from legacy systems and uploaded it into Oracle using SQL*Loader and PL/SQL
• Extensively made use of Business Intelligence tool Business Objects for report generation.
• Designed and developed Oracle PL/SQL and Shell Scripts, Data Import/Export, Data Conversions and Data Cleansing
• Performance tuning of PL/SQL Scripts
• Extracted from and loaded data into databases such as DB2, SQL Server, Oracle, Mainframe flat files
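The Hashed File reference tables above are, in essence, in-memory keyed lookups used to enrich each streaming row. A Python sketch of that lookup pattern (column names are hypothetical):

```python
def lookup_join(stream, reference):
    """Enrich stream rows from an in-memory reference keyed on a
    single column -- the role a Hashed File stage plays in a lookup."""
    out = []
    for row in stream:
        ref = reference.get(row["cust_id"])  # O(1) keyed probe
        out.append({**row, "segment": ref["segment"] if ref else None})
    return out

# Hypothetical reference data, keyed on the single lookup column
reference = {"C1": {"segment": "RETAIL"}, "C2": {"segment": "CORP"}}
stream = [{"cust_id": "C1", "amt": 10}, {"cust_id": "C9", "amt": 7}]
joined = lookup_join(stream, reference)
```

Unmatched keys fall through with a null, which in the real jobs would typically be routed to a reject link or defaulted.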

Environment: Ascential DataStage 6.0/5.2, Parallel Extender, Quality Stage, MetaStage, Erwin 3.5.2, Business Objects 6.0, PL/SQL, COBOL, Oracle 9i/8i, DB2 UDB 7.0, SQL Server 2000, Teradata V2R4, SAP R/3, SAP PACK, Windows XP, HP-UX, AIX 4.3.3.

AT&T, San Ramon, CA Feb 2002 - March 2003
Data Analyst and ETL Consultant

For more than 125 years, AT&T has been known for unparalleled quality and reliability in communications. Backed by the research and development capabilities of AT&T Labs, the company is a global leader in local, long distance, Internet and transaction-based voice and data services.

Responsibilities:
• Analyzed the project database requirements from the user
• Arrived at the querying criteria and usage of the queries with the aggregations
• Defined aggregations required and granularity available and required in DW
• Defined the partitioning of the data
• Collected the information about different entities and attributes
• Studied the existing ODS by reverse engineering it into ERwin
• Normalized the entities and denormalized them for DW purposes
• Defined the PKs and FKs for the entities
• Defined the relationships and cardinalities between different entities
• Created super type and sub-type entities and Category entities
• Defined the domains for the attributes and Validation rules
• Created the logical schema using ERwin 4.x and created the dimensional model for building the cubes
• Designed DataStage ETL jobs for extracting data from heterogeneous source systems, transform and finally load into the Data Marts
• Identified source systems, their connectivity, related tables and fields, and ensured data suitability for mapping
• Generated Surrogate IDs for the dimensions in the fact table for indexed and faster access of data
• Created surrogate key tables to map the required dimensions
• Created hash tables used for referential integrity and/or otherwise while transforming the data representing valid information
• Ensured proper selection of hash table design parameters for faster table look-ups
• Created re-usable components using shared containers for local use or shared use
• Imported and exported repositories across projects
• Used Aggregator stages to sum the key performance indicators used in decision support systems
• Tuned DataStage transformations and jobs to enhance their performance
• Wrote PL/SQL statement and stored procedures in Oracle for extracting and writing data
• Used DataStage Director to schedule running the solution, testing and debugging its components, and monitoring the resulting executable versions on an ad hoc or scheduled basis
• Created job schedules to automate the ETL process
• Prepared the documentation of Data Acquisition and Interface System Design
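The surrogate key tables above map natural keys to stable integer IDs, minting a new ID only for natural keys that have not been seen before. A minimal Python sketch of that pattern (key values are hypothetical):

```python
def assign_surrogates(natural_keys, key_map, next_id=1):
    """Map natural keys to stable surrogate IDs, creating a new ID
    only for unseen keys -- the surrogate key table pattern."""
    for nk in natural_keys:
        if nk not in key_map:
            key_map[nk] = next_id
            next_id += 1
    return key_map, next_id

# First batch assigns fresh IDs; second batch reuses them where possible
key_map, nxt = assign_surrogates(["CA-001", "CA-002"], {}, 1)
key_map, nxt = assign_surrogates(["CA-002", "CA-003"], key_map, nxt)
```

Persisting the map (here a plain dict) between runs is what keeps fact rows pointing at the same dimension ID load after load.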

Environment: Ascential DataStage 5.2/3.7(Designer, Director, Manager), MetaStage, ERwin 4.0, Windows NT 4.0, UNIX, Oracle 7, PL/SQL, DOS and UNIX Sequential Files, MS Access, XML Files

Datacon’s Pvt. Ltd, Bangalore, India May 2000 - Dec 2001
Programmer

The system provides computerized registration of customer accounts, lets customers view their accounts online and submit mail for clarifications, and gives online information on customers' account details. It is a Strategic Decision Support System providing complete transactional information for all types of accounts across branches nationwide, and it keeps track of account-related information.

Responsibilities:
• Involved in designing the database model
• Designed and developed forms for the user interface using Oracle Forms
• Wrote PL/SQL scripts to create, alter and drop database objects such as tables, views, sequences, procedures and functions; wrote stored procedures and functions to retrieve data from the database using PL/SQL
• Developed various database triggers in Oracle 7.0 with SQL*Plus and PL/SQL in both UNIX and Windows environments
• Created triggers, stored in the database, that fired when the contents of the database changed
• Involved in performance tuning of the application
• Designed and developed matrix reports using Oracle Reports for the analysis of the data

Environment: Oracle 7.0, UNIX and Windows

