Configured XML Firewall loopback proxy to test all the configurations in multiple steps.
Participated in the identification of key performance indicators and methods for the data warehouse to assist with operational and strategic planning.
Partitioned sources and used persistent cache for Lookups to improve session performance.
Prepared test matrix, test data and test cases for the SQA team.
Developed all initial-load ETL utilizing PL/SQL and 3GL routines to load extracts from quarterly CD publications spanning a 10-year period.
Developed strategies for data acquisition, archive recovery and implementation of a database.
Loaded data into Simple Storage Service (S3) in the AWS Cloud (see the boto3 sketch after this list).
Worked along with the Data Warehouse Architect and DBAs to design the ODS data model for reporting purposes.
Designed and implemented a MapReduce-based large-scale parallel relation-learning system.
Responsible for delivery of multiple projects and managed revenue of over $18MM.
Worked on syncing an Oracle RDBMS to Hadoop while retaining Oracle as the main data store.
Created pivot tables in Excel using data from various sources such as SSRS and MS Access.
Provided full support for the Smart Link application and was responsible for updating the feedback mailbox as well as the issue log.
Received PowerCenter Developer training from Informatica Corporation.
Documented Informatica mappings, design and validation rules.
Involved in analyzing source systems and designing the processes for extracting, transforming and loading the data into the Teradata database.
Used Impala connectivity from the user interface (UI) and queried the results using ImpalaQL.
Used Sqoop to fetch data from the Oracle database and send it back.
Created the Data Dictionary and EDW stage-load-to-target-load process documents.
Developed Perl scripts for table creation.
Implemented a Flume custom interceptor to perform cleansing operations before moving data into HDFS.
Created, maintained, and deployed cubes and packages for use through Cognos Connection using PowerPlay and Framework Manager.
Designed and developed programs to bring customer relational data into the data warehouse.
Worked on debugging, performance tuning and analyzing data using the Hadoop components Hive and Pig.
Communicated the impact of database changes and other systems to business stakeholders and co-workers.
Designed and implemented stored procedures, views and other application database code objects.
Developed SQL scripts using Spark for handling different data sets and verifying performance against MapReduce jobs.
Coordinated with subject matter experts and the QA team on launch timing and sequencing.
Accessed data from relational tables that are not sources in a mapping using the Lookup transformation.
Converted the logical database design into a physical schema using balanced normalization in 2nd and 3rd normal form for highest performance.
Designed lookup strategies using the Hash File stage for data extraction from the source systems.
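The S3 load bullet above can be illustrated with a small Python sketch. This is a minimal example, assuming the boto3 SDK is installed and AWS credentials are configured in the environment; the bucket, key, and file names are hypothetical, not taken from any particular project.

```python
# Minimal sketch of loading a staged extract file into Amazon S3 with boto3.
# Assumptions: boto3 is installed and AWS credentials are available
# (e.g. via environment variables); bucket and file names are hypothetical.
import boto3

s3 = boto3.client("s3")

def load_extract_to_s3(local_path: str, bucket: str, key: str) -> None:
    """Upload one staged extract file to the given S3 bucket/key."""
    s3.upload_file(Filename=local_path, Bucket=bucket, Key=key)

if __name__ == "__main__":
    load_extract_to_s3("daily_extract.csv", "my-dw-staging-bucket",
                       "landing/2024-01-01/daily_extract.csv")
```

In a scheduled load, a wrapper like this would typically be called once per extract file by the nightly batch job.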
Developed and created logical and physical database architecture utilizing ERwin Data Modeler.
Utilized PL/SQL, Cognos, and TOAD for creating and maintaining ad-hoc reports.
Developed various mappings to populate aggregate or summary tables called MDC tables in DB2.
Developed the necessary metadata repositories for all initiatives and ensured these meet current needs and provide flexibility for future growth.
Developed Oozie workflows for job scheduling.
Involved in identifying job dependencies to design workflows for Oozie & YARN resource management.
Involved in analyzing bugs and the performance of PL/SQL queries and provided solutions to improve them.
Converted HL7-format claims files to XML files using a parser in B2B Data Transformation.
Provided detailed data analysis and report prototypes for Charter Corporate and all regions.
Performed comprehensive unit testing by comparing the Cognos reports against the database using SQL in Toad.
Implemented in-memory caching using bloom filters and block cache in HBase.
Designed and developed OLAP business models and reports using PowerPlay to analyze budgets and cash flow for the Asset Management group.
Used Flume to collect the entire web log from the online ad servers and push it into HDFS.
Created covering indexes to avoid bookmark lookups and improve query performance.
Developed UNIX shell scripts for scheduling jobs and for automation of ETL processes.
Used TOAD to run SQL queries and validate the data pulled into BO reports.
Developed Oracle stored procedures, functions and packages to effectively incorporate business rules.
Involved in unit testing prior to handing the code to QA.
Developed documentation for the procedures.
Deployed the project on Amazon EMR with S3 connectivity for backup storage.
Extracted, transformed, and loaded data into an Oracle database from DB2 sources.
Performed testing to ensure data was converted properly and conversion programs were written correctly.
Identified source systems, their connectivity, related tables and fields, and ensured data consistency for mapping.
Designed the ETL process flows to load the data into the DB2 data mart from heterogeneous sources.
Developed a Pig program for loading and converting data into Parquet in the Hadoop data lake (an illustrative Parquet conversion sketch follows this list).
Experienced in writing ANT and Maven scripts to build and deploy Java applications.
Defined report layouts including report parameters and wrote queries for drill-down reports per client requirements using SSRS 2016.
Rolled out source system integration with Single Customer View/Geography using IBM MQ messages/Informatica.
Designed and developed a high-volume data warehouse and OLAP cubes using SSAS.
Integrated 5+ data source systems into the Capital Markets Data Warehouse for regulatory reporting.
Migrated 43 statistical COGNOS Series 7 reports to a single COGNOS ReportNet report.
Used SSAS to create cubes with various measures and dimensions for financial report generation.
Developed reports using SQL Server Reporting Services (SSRS) and SSIS packages and designed ETL processes.
Scheduled and monitored the process using Informatica Server Manager.
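The Parquet-conversion bullet above was done with Pig in the original; as a hedged, swapped-in analogue in Python, the pyarrow library can perform the same delimited-file-to-Parquet step. File names are hypothetical.

```python
# Sketch: convert a delimited extract to snappy-compressed Parquet,
# analogous to the Pig-based conversion described above (pyarrow stands
# in for Pig here). File names are hypothetical; pyarrow must be installed.
import pyarrow.csv as pv
import pyarrow.parquet as pq

# Read the raw delimited file into an in-memory Arrow table.
table = pv.read_csv("raw_extract.csv")

# Write it out as Parquet for downstream data-lake consumers.
pq.write_table(table, "raw_extract.parquet", compression="snappy")
```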
Created technical design specifications based on business requirements.
Involved in creating logical and physical database designs using Erwin.
Managed analysis, design, coding and testing of ETL jobs for 7 source systems.
Used Incremental Aggregation in the Aggregator transformation to make sure the measures for certain aggregate tables were calculated properly.
Performed code reviews to check adherence to published coding standards and evaluate sustainability of code, resulting in increased productivity.
Used Torrent Orchestrate, a component-based framework for ETL processes, to build parallel applications running on massively parallel systems.
Designed and built new dimensional data models and schema designs to improve accessibility, efficiency, and quality of data.
Used the Spark DataFrame API to process structured and semi-structured files and load them back into an S3 bucket (a PySpark sketch follows this list).
Worked with development and QA teams to alter processes and implement testing to nearly eliminate the occurrence of errors.
Established a custom software team to increase corporate revenue over 30%.
Applied Slowly Changing Dimension and dynamic lookup techniques.
Collaborated in extraction of OLAP data from SSAS using SSIS.
Coordinated with business customers to gather business requirements.
Designed and programmed a new member billing system using Oracle 9i PL/SQL, which replaced the legacy COBOL billing system.
Worked on production servers on Amazon Cloud (EC2, EBS, S3, Lambda and Route 53).
Worked with AS400, JD Edwards and Island Pacific teams in resolving several data discrepancies.
Performed operations, integration and project responsibilities targeting risk management with the Infrastructure Access and Security Management (IASM) team.
Used DataStage custom routines, shared containers and open/close run cycles to load the data per client standards.
Involved in writing code using Base SAS and SAS/Macros to extract, clean and validate data from Teradata tables.
Secured and configured ETL/SSIS packages for deployment to production using Package Configurations and the Deployment Wizard.
Used various transformations, including the XML Parser transformation, to parse the web log files and load them into Oracle.
Involved in system testing strategy preparation, designed various test cases, and performed tuning of the ETL jobs at various levels.
Fine-tuned existing DataStage jobs for performance optimization.
Developed multidimensional models using Cognos PowerPlay Transformer.
Configured Reporting Services to run in SharePoint-integrated mode and deployed SSRS reports to the client's QA and production environments.
Experienced in loading and transforming large sets of structured, semi-structured and unstructured data using Hadoop concepts.
Developed FTP and UNIX scripts to deliver and receive files to and from vendors.
Maintained the Oracle PL/SQL packages on the legacy Oracle 10g databases when business requests arose for program modification.
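The Spark DataFrame bullet above lends itself to a short PySpark sketch: read a structured (CSV) and a semi-structured (JSON) file, apply a simple transformation, and write the result back to S3. Bucket paths and column names are hypothetical, and the cluster is assumed to have the s3a connector configured.

```python
# Minimal PySpark sketch: process structured and semi-structured files
# and land the result back in an S3 bucket. All paths/columns are
# hypothetical illustrations, not the original project's schema.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3-dataframe-etl").getOrCreate()

orders = spark.read.csv("s3a://my-bucket/landing/orders.csv",
                        header=True, inferSchema=True)
events = spark.read.json("s3a://my-bucket/landing/events.json")

# Example transformation: keep completed orders and stamp the load date.
completed = (orders.filter(F.col("status") == "COMPLETED")
                   .withColumn("load_dt", F.current_date()))

completed.write.mode("overwrite").parquet("s3a://my-bucket/curated/orders/")
```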
Designed and implemented daily data migration between the central data warehouse server, the application/reporting database server and the analyst database server.
Developed various bulk load and update procedures and processes using SQL*Loader and PL/SQL in an Oracle 9i environment.
Performed tuning and optimization of complex SQL queries by analyzing Teradata EXPLAIN plans.
Performed data manipulation using various Informatica transformations such as Joiner, Expression, Lookup, Aggregator, Filter, and Update Strategy.
Used DB2 stages to read data and transform it into target SQL tables using various transformation (business) rules.
Created high-level design documents and technical specifications, performed coding and unit testing, and resolved defects using Quality Center 10.
Fine-tuned DataStage jobs and routines for optimal performance.
Worked on technical design documents and presented them to management for approval.
Developed logical and physical database models to design an OLTP system for US Customs applications.
Developed a data pipeline using Kafka and Storm to store data into HDFS (a Kafka producer sketch follows this list).
Developed PL/SQL and SQL code using information from metadata.
Exported the aggregated data onto Oracle using Sqoop for reporting on the dashboard.
Worked with internal and external customers to develop value-added data warehouse solutions for reporting needs.
Created and maintained several enterprise-wide OLAP cubes and executive dashboards.
Participated in all phases of data warehouse implementation, including requirements gathering, ETL, and MicroStrategy administration and development.
Worked extensively with Sqoop for importing data from multiple sources.
Developed and designed an ETL application and automated it using Oozie workflows and shell scripts with error handling and a mailing system.
Designed & developed proof-of-concept solutions addressing business requirements.
Created a design document for the data flow process from different source systems to the target system.
Tested reports in the QA environment and migrated reports using CMC Promotion Management as well as the Import Wizard.
Worked on data serialization formats for converting complex objects into sequences of bits using JSON and XML formats.
Developed MapReduce jobs to process datasets of semi-structured data.
Involved in creating Oozie workflow and coordinator jobs to kick off the jobs on time and on data availability.
Developed MapReduce programs for data access and manipulation.
Transformed request messages from SOAP XML to CWF & TDS based on the requirement and sent them to the mainframe.
Designed and developed a Business Intelligence architecture document to implement an OLAP solution.
Developed complex PL/SQL data loading packages (working with SQL*Loader) for the Postal reporting system (ICPAS).
Involved in the complete implementation lifecycle, specializing in writing custom MapReduce, Pig and Hive programs.
Configured Oozie workflows to run multiple Hive and Pig jobs which run independently based on time and data availability.
Designed and developed parallel jobs using DataStage Designer per the mapping specifications using appropriate stages.
Prepared the deployment approach document, system requirement documents, test plans and test cases, and worked on the test strategy.
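For the Kafka/Storm pipeline bullet above, only the producing side is easy to show briefly. This is a hedged sketch assuming the kafka-python package; the broker address, topic name, and event fields are hypothetical, and the Storm/HDFS consumer half is omitted.

```python
# Sketch of the producing side of a Kafka-based ingestion pipeline,
# assuming the kafka-python package. Broker, topic and event fields are
# hypothetical; the Storm topology that lands data in HDFS is not shown.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda record: json.dumps(record).encode("utf-8"),
)

# Publish one clickstream event to the ingestion topic.
event = {"user_id": 42, "action": "page_view", "ts": "2024-01-01T00:00:00Z"}
producer.send("clickstream-events", event)
producer.flush()  # block until the event is actually delivered
```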
Performed database administration for a SQL Server-based staging environment.
Developed and deployed an Enterprise Data Warehouse (EDW) to support operational and strategic reporting.
Automated various daily activities using SSRS.
Created an SSIS package to load XML files into SQL tables.
Monitored system performance, providing capacity planning to prevent server failures and maximize availability.
Conducted data analysis, code reviews, and unit and system testing.
Imported and exported repositories across DataStage projects.
Worked on complex coding in ESQL to capture the required data from different levels of an XML document.
Used SAP Data Services (3.2/3.1) for migrating data from OLTP databases and SAP R/3 to the data warehouse.
Analyzed the source systems to identify subject areas and fact and dimension entities.
Moved the dataset of reports from OLTP to the new data mart.
Imported streaming data using Apache Storm and Apache Kafka into HBase and designed Hive tables on top.
Designed and developed Perl scripts to pre-process text files before loading them into the Oracle database.
Helped front-end OLTP application developers with their queries.
Documented user requests and created design documents.
Loaded and transformed data into a Hadoop cluster from a large set of structured data using Talend Big Data Studio.
Defined the entity relations (ER diagrams) and designed the physical databases for OLTP and OLAP (data warehouse).
Involved in integrating HBase with PySpark to import data into HBase and performed some CRUD operations on HBase (a CRUD sketch follows this list).
Developed unit/assembly test cases and UNIX shell scripts to run along with daily/weekly/monthly batches to reduce or eliminate manual testing effort.
Supported the production environment by optimizing existing stored procedures and scripts.
Performed physical modeling, ETL, staging design, MySQL implementation, OLAP, and end-user query and report generation.
Imported data from an Oracle database to HDFS using a UNIX-based file watcher tool.
Maintained and created UNIX scripts used in nightly processing to load updates for the OPEX Oracle Data Warehouse.
Gathered, analyzed and unified business requirements across departments.
Involved in scheduling an Oozie workflow to automatically update the firewall.
Developed Java test utilities for validating various codes in relational staging tables.
Migrated data from multiple data sources such as XML, MySQL, Microsoft SQL Server, Oracle, and flat files.
Designed the ODS for product cost controlling, profitability analysis and overhead cost controlling.
Designed, developed and implemented the star schema (SSAS cubes).
Supported data warehouses to resolve data integrity issues and refine existing processes.
Prepared the test strategy and test plans for unit, SIT, UAT and performance testing.
Utilized JavaScript to design EIS consoles and modify parameters for advanced BrioQuery reports.
Developed various Hive queries to process data from various source system tables.
Used IBM InfoSphere DataStage to extract data from DB2, Oracle and flat files and load it into target tables.
Developed a UNIX shell script to run jobs in multiple instances using a parameter file.
Involved in requirement gathering and analysis of source data as data comes in from different source systems.
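The HBase CRUD bullet above can be sketched from Python with the happybase client (shown here as a simple stand-in for the PySpark integration mentioned in the bullet). The host, table, and column names are hypothetical, and an HBase Thrift server is assumed to be running.

```python
# Hedged sketch of basic HBase CRUD from Python via happybase.
# Host, table and column names are hypothetical; HBase's Thrift
# gateway must be reachable for this to run.
import happybase

connection = happybase.Connection("hbase-thrift-host")
table = connection.table("customer")

# Create / update: put a row keyed by customer id.
table.put(b"cust#1001", {b"info:name": b"Acme Corp",
                         b"info:region": b"NA"})

# Read: fetch the row back as a dict of column -> value.
row = table.row(b"cust#1001")
print(row[b"info:name"])

# Delete: remove the row once it is no longer needed.
table.delete(b"cust#1001")

connection.close()
```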
Designed, deployed, and maintained various SSRS reports in SQL Server 2008.
Created jobs and job variable files for Teradata TPT and loaded them using the build command from the command line.
Performed LDM/PDM using Erwin for the landing zone (source image area) with Teradata Temporal and Partition features.
Worked on ad-hoc requests from the business to run data analysis, identify patterns and provide data support on a daily basis.
Developed a process for updating HBase (MapR-DB) tables with Hive data on a daily basis.
Worked on testing of developed jobs and documented different test cases for different source systems.
Fine-tuned existing Informatica mappings for performance optimization.
Created and facilitated presentations and demonstrations for Informatica.
Scheduled reports through Cognos Event Studio to get the daily output of reports.
Created gap analyses between the source, ODS and EDW.
Assisted supply chain analysts with automating reporting functionality using Power BI tools.
Reduced application issues and increased overall reliability by performing testing and quality assurance procedures for a new OSS application.
Created workflows for coordinator jobs in Oozie.
Involved in creating packages, procedures, functions & triggers, and embedding dynamic SQL features and advanced packages in PL/SQL.
Designed SSIS package templates as base code for package development, incorporating package configurations, connection managers and logging.
Involved in configuring and maintaining the cluster, and managing & reviewing Hadoop log files.
Started the VHA Metadata Repository Working Group.
Involved in preparing unit and regression test plans and test plans for the whole ETL process.
Developed Cognos 8 reports for the end users to analyze the data in the data warehouse.
Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in partitioned tables (a Hadoop Streaming sketch follows this list).
Implemented Sqoop jobs for large data exchanges between RDBMS and HBase/Hive/Cassandra clusters.
Participated in the data warehouse life cycle, including requirement analysis, modeling and design.
Coordinated with users to determine requirements and prepared design documents.
Designed, developed and implemented PowerPlay cubes using Cognos Transformer.
Worked with DBAs to consistently improve the overall performance of the data warehouse process and to debug Oracle errors.
Developed Awk, Sed, Perl and Korn shell scripts to manipulate large data sets in a UNIX environment.
Created multiple project plans for the COGNOS 7 to COGNOS 10 conversion.
Coordinated with onshore and offshore teams regarding business needs and ensured adherence to business requirements.
Generated XML files, stored BLOB files in the database, and built a currency conversion process using Oracle PL/SQL packages and procedures.
Generated various reports according to business requirements using Crystal Reports.
Evaluated new technical specifications of Cognos to replace Aperio.
Worked with the DBA team and network engineers to troubleshoot production problems.
Used HBase for scalable storage and fast queries.
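The raw-log parsing bullet above maps naturally onto Hadoop Streaming, which lets MapReduce jobs be written as plain Python filters over stdin/stdout. This is a hedged sketch; the tab-delimited field layout and the per-user counting are hypothetical illustrations, not the original job.

```python
# mapper.py - Hadoop Streaming mapper: parse raw tab-delimited log lines
# and emit (user_id, 1) pairs. Field layout is hypothetical.
import sys

for line in sys.stdin:
    fields = line.rstrip("\n").split("\t")
    if len(fields) < 3:
        continue                      # skip malformed records
    user_id, action = fields[0], fields[1]
    print(f"{user_id}\t1")            # one count per parsed event
```

```python
# reducer.py - companion reducer: Hadoop Streaming delivers mapper output
# sorted by key, so per-user counts accumulate in a single pass.
import sys

current_key, count = None, 0
for line in sys.stdin:
    key, value = line.rstrip("\n").split("\t")
    if key != current_key:
        if current_key is not None:
            print(f"{current_key}\t{count}")
        current_key, count = key, 0
    count += int(value)
if current_key is not None:
    print(f"{current_key}\t{count}")
```

A job of this shape is typically launched with the hadoop-streaming jar, passing the two scripts via the -mapper and -reducer options.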
Collaborated with database administrators, application developers, and business users to ensure business requirements meet expected results.
Used reverse engineering to connect to an existing database and create a graphical representation.
Developed a search engine to fix and match dirty descriptions of items against the database using PL/SQL and regular expressions.
Identified new technologies or technological improvements to include in the application to improve usability, stability and maintainability.
Developed MapReduce jobs to convert data files into Parquet file format.
Created an SSAS cube for the Sales department to do future forecast analysis based on sales.
Analyzed all the existing source systems and the new data model of the target system.
Analyzed old reports designed in Crystal Reports and transformed those reports into SSRS reports with proper mapping and error handling.
Used Quest TOAD for PL/SQL scripts and packages.
Created business-specific reports using Cognos Impromptu.
Designed ETL for JD Edwards on Oracle and MS SQL Server source data.
Implemented and deployed a system to Windows Azure, Microsoft's cloud solution.
Optimized the embedded application T-SQL code as well as the stored procedures used to feed reports for better latency.
Created several tables and views using SQL Server 2005 as well as Oracle 9i, using Toad for Oracle.
Developed detailed analysis of the DB2 data warehouse for creating the database and data marts.
Implemented Aggregator, Filter, Joiner, Expression, Sorter, Lookup, Update Strategy, Normalizer and Sequence Generator transformations.
Stored extracted documents in XML with content in Base64 compression to reduce disk storage requirements.
Designed the database and schema objects; the models were developed in accordance with the rules of referential integrity and normalization.
Developed complex PL/SQL procedures to implement data integrity.
Involved in end-to-end implementation of the ETL process using OWB, PL/SQL, Perl and UNIX.
Used Lookup, Join and CDC operator stages to take care of slowly changing dimensions (an SCD Type 2 sketch follows this list).
Used the BMC Control-M workload scheduler to execute the UNIX shell scripts and Oracle 9i PL/SQL packages.
Designed and developed Toad reports and stored procedures for the Audit and Finance departments to suit their needs.
Created test plans for regression and unit testing in the form of scripts to test database applications.
Created a dimensional data model to load hospital transplant data into the EDW.
Created reports with SSRS, including summary, drill-down, and matrix reports.
Implemented an application that monitored major metrics inside Cognos cubes.
Used the FastExport utility to extract large volumes of data at high speed from the Teradata RDBMS.
Designed, developed, and tested data warehouse prototypes to validate business requirements and outcomes.
Normalized the database tables to 3NF to put them into the star schema of the data warehouse.
Involved in pivoting HDFS data from rows to columns and columns to rows.
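The slowly-changing-dimension bullet above was implemented with DataStage CDC stages; as a hedged stand-in, the SCD Type 2 logic (expire the old version, insert the new one) can be sketched in pandas. Table and column names are hypothetical, and brand-new keys are omitted for brevity.

```python
# Hedged sketch of Slowly Changing Dimension Type 2 maintenance in pandas
# (a stand-in for the DataStage CDC stages named above). Columns are
# hypothetical: dim has cust_id, region, start_dt, end_dt; incoming has
# cust_id, region. Brand-new customers are omitted for brevity.
import pandas as pd

HIGH_DATE = pd.Timestamp("9999-12-31")

def apply_scd2(dim: pd.DataFrame, incoming: pd.DataFrame,
               load_dt: pd.Timestamp) -> pd.DataFrame:
    """Expire changed current rows and append new versions."""
    current = dim[dim["end_dt"] == HIGH_DATE]
    merged = current.merge(incoming, on="cust_id", suffixes=("_old", ""))

    # Rows whose tracked attribute changed need a new version.
    changed = merged[merged["region_old"] != merged["region"]]

    # 1. Expire the old version by closing its end date.
    dim.loc[dim["cust_id"].isin(changed["cust_id"])
            & (dim["end_dt"] == HIGH_DATE), "end_dt"] = load_dt

    # 2. Insert the new version as the current row.
    new_rows = changed[["cust_id", "region"]].assign(
        start_dt=load_dt, end_dt=HIGH_DATE)
    return pd.concat([dim, new_rows], ignore_index=True)
```

The same expire-then-insert pattern is what the Lookup/CDC stage combination produces inside an ETL tool; the dataframe version just makes the two steps explicit.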
Developed Informatica mappings using Designer per requirements and loaded data into the database from relational and non-relational sources.
Monitored database systems and identified actual and potential database problems.
Created queries in SQL Server 2008 that performed routine audits on payments made/received, saving hundreds of employee hours.
Executed Hadoop/Spark jobs on AWS EMR using programs and data stored in S3 buckets (an EMR step-submission sketch follows this list).
Worked with business analysts to prepare ETL mapping documents and design documents.
Created design specs and custom PL/SQL code to process dividend re-investment data, omnibus data and goal data.
Designed the generic modules for the financial data warehouse using the Native Dynamic SQL feature new in Oracle 8i Enterprise Edition.
Served as team leader and project manager during a successful migration to a new version of Cognos software.
Decreased the impact of ETL/data conversion processes on implementation timelines by improving documentation and the performance of T-SQL scripts.
Involved in writing an interceptor for the Flume agent to process the data before saving it into the cluster.
Assisted in analyzing incoming equipment and developing the necessary control applications in Linux and UNIX.
Created a frequency/loss-ratio data warehouse from policy data utilizing multiple policy characteristics.
Developed the UI design and its connection to the integration and deployment tools in Java using the Spring framework.
Defined and implemented EDW integration standards such as source data retrieval, validation and loading.
Scheduled an overnight process to execute the SSIS solution to refresh the data warehouse by scheduling jobs in Management Studio.
Involved in interaction with users for requirement analysis and design of the user interface.
Involved in designing the data warehouse based on business requirements and provided presentations and documentation.
Demonstrated expertise in the Informatica PowerCenter product.
Developed and maintained a VB.NET-based data warehouse reporting and analysis front-end web application.
Designed and implemented an OLTP database using Microsoft SQL Server 2008.
Tested MPP features of the DB2 engine across a cluster of 4 nodes using geo-spatial databases/queries.
Worked on extracting data from the Oracle database and loading it into DB2 for Prism to get rid of the MQ process setup.
Interacted with end users and functional analysts to identify and develop business requirements and transform them into technical requirements.
Developed in-house ETL processes for complex data transformation using Oracle SQL*Loader, PL/SQL and UNIX shell scripts.
Used Lookup, Sort, Merge, Funnel, Filter, Transformer and Sequencer stages.
Interfaced with Engineering, Accounting, Marketing, QA, and IT teams for data reconciliation and validation.
Created several back-end systems to load data from OLTP systems into the data warehouse and from the data warehouse into data marts.
Performed an administrator role in migrating objects from one environment to the others (DEV/QA/PROD).
Implemented Flume, Kafka, Spark, and Spark Streaming with a MemSQL pipeline for real-time data processing.
Created and defined DDL for the tables in the staging area and documented the ETL development process, including installation and troubleshooting guides.
Configured OLAP database-level and cube-level role-based security.
Created and monitored batches and sessions using Informatica PowerCenter Server.
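Submitting a Spark job to a running EMR cluster, as in the EMR bullet above, is commonly done through the boto3 add_job_flow_steps call. This is a hedged sketch: the cluster id, region, bucket, and script path are hypothetical placeholders.

```python
# Hedged sketch of submitting a Spark step to a running AWS EMR cluster
# with boto3. Cluster id, region, bucket and script path are hypothetical.
import boto3

emr = boto3.client("emr", region_name="us-east-1")

response = emr.add_job_flow_steps(
    JobFlowId="j-XXXXXXXXXXXXX",  # hypothetical cluster id
    Steps=[{
        "Name": "nightly-etl",
        "ActionOnFailure": "CONTINUE",
        "HadoopJarStep": {
            # command-runner.jar is the standard EMR entry point for
            # invoking spark-submit on the master node.
            "Jar": "command-runner.jar",
            "Args": ["spark-submit", "--deploy-mode", "cluster",
                     "s3://my-bucket/jobs/nightly_etl.py"],
        },
    }],
)
print(response["StepIds"])  # ids of the newly queued steps
```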
Used the DataStage Director client to validate, run and schedule jobs that process feed files into the database.
Gathered requirements for compilation into functional and technical specifications.
Implemented Usage-Based Optimization (UBO) mechanisms for SSAS.
Worked with various transformation types: Lookup, Sort, Merge, Joiner, Expression, Filter and Sorter.
Performed system testing and User Acceptance Testing with QA involvement.
Installed Hadoop updates, operating system patches and version upgrades when required.
Implemented SSRS subscriptions on SharePoint that went beyond the limited SharePoint subscription options.
Resolved Help Desk tickets and solicited feedback from end users.
Evaluated Oozie for workflow orchestration.
Stored data in HDFS using various compression mechanisms.
Designed a new EDW data mart from heterogeneous systems in a top-down approach.
Developed Cognos, Crystal, Web Intelligence and Desktop Intelligence reports.
Created tabular reports, charts, functions and maps using SSRS 2005.
Migrated mappings from the development environment to QA.
Deployed SAS OLAP cubes for browsing of financial data.
Ingested data using NiFi & Kafka into HBase and Hive tables.
Used Teradata Queryman to validate, run and schedule queries.
Built ETL packages that update OLAP cubes and executive dashboards.
Wrote Hive queries which run internally in MapReduce.
Implemented Type 1 and Type 2 slowly changing dimensions.
Loaded data from heterogeneous sources such as SQL Server and flat files using Teradata MLOAD & FLOAD.
Used the FastExport utility to extract daily incremental data (a date-bounded incremental extract sketch follows this list).
Produced monthly, weekly and daily payment reports.
Ran SQL scripts via Toad for querying the data.
Worked on an application with thousands of classes, applying business logic and using Java APIs to parse the raw data.
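The daily incremental-extract bullet above follows a date-bounded pull pattern that is easy to sketch in Python. This is a minimal, hedged example: sqlite3 stands in for the warehouse RDBMS so the sketch is self-contained, and the table and column names are hypothetical.

```python
# Hedged sketch of a date-bounded incremental extract, the pattern behind
# the daily-incremental bullet above. sqlite3 stands in for the warehouse
# RDBMS; table and column names are hypothetical.
import sqlite3
from datetime import date, timedelta

def extract_incremental(conn, start: date, end: date):
    """Pull only rows whose updated_at falls inside [start, end)."""
    sql = """
        SELECT order_id, amount, updated_at
        FROM sales
        WHERE updated_at >= ? AND updated_at < ?
    """
    return conn.execute(sql, (start.isoformat(), end.isoformat())).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (order_id INT, amount REAL, updated_at TEXT)")
    conn.execute("INSERT INTO sales VALUES (1, 9.99, '2024-01-01')")
    day = date(2024, 1, 1)
    print(extract_incremental(conn, day, day + timedelta(days=1)))
```

Each nightly run advances the [start, end) window by one day, so only rows changed since the previous load are re-extracted.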