Wrote many system utility programs to perform data loading, data parsing, and automated database backups.
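The bullet above mentions automated database backups without naming a DBMS or scheduler; a minimal sketch of such a utility, using the stdlib sqlite3 online-backup API as a stand-in, might look like this (the paths and the timestamped naming scheme are illustrative choices, not from the original):

```python
# Sketch of an automated database backup utility. sqlite3 is a stand-in
# for the unnamed DBMS; db_path/backup_dir naming is illustrative.
import sqlite3
from datetime import datetime
from pathlib import Path

def backup_database(db_path: str, backup_dir: str) -> Path:
    """Copy a live database into a timestamped backup file."""
    Path(backup_dir).mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    dest = Path(backup_dir) / f"{Path(db_path).stem}_{stamp}.bak"
    src = sqlite3.connect(db_path)
    dst = sqlite3.connect(dest)
    try:
        src.backup(dst)  # online, page-level consistent copy
    finally:
        dst.close()
        src.close()
    return dest
```

In production such a script would be run from a scheduler (cron, Task Scheduler) so the backups run automatically, as the bullet describes.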
Noakes12 recently argued from the data of Baldwin et al13 that when exercise was performed at a constant work rate (thus negating any effect of pacing strategy) to volitional fatigue in either a glycogen-loaded or glycogen-depleted state, performance time was determined by the rate at which the RPE rose from its starting value to a maximal tolerable value.
Collectively, therefore, the data appear to indicate that during pull-ups performed with comparable loads, changes in stability and forearm motion do not affect latissimus dorsi muscle activity.
They transform and load data through ETL and perform data management processes efficiently.
Features: With the appropriate applications, perform the following operations:
Read diagnostic trouble codes, both generic and manufacturer-specific, and display code descriptions with over 3,000 universal code definitions in the database.
Clear diagnostic trouble codes and turn off check engine lights.
Display and record real-time sensor data, including:
· Vehicle Speed
· RPM
· Fuel Consumption
· Engine Coolant Temp
· Fuel Pressure
· Calculated Engine Load
· Throttle Position
· Intake Manifold Pressure
· Air Intake Temp
· Timing Advance
· Mass Air Flow
· Fuel Level
· Barometric Pressure
· EVAP System Vapor Pressure
· Fuel Trim
· + More
Note: Your vehicle may not support all engine metrics.
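The metrics listed above are standard OBD-II Mode 01 sensors; the published SAE J1979 scalings for a few of them can be sketched as follows (the formulas are the standard ones; the function names and example bytes are mine):

```python
# Decoding a few standard OBD-II Mode 01 sensor values (SAE J1979).
# A and B are the raw response bytes returned for each PID.

def engine_rpm(a: int, b: int) -> float:
    """PID 0x0C: RPM = (256*A + B) / 4."""
    return (256 * a + b) / 4

def coolant_temp_c(a: int) -> int:
    """PID 0x05: engine coolant temperature in Celsius (offset -40)."""
    return a - 40

def calculated_load_pct(a: int) -> float:
    """PID 0x04: calculated engine load as a percentage."""
    return a * 100 / 255

def vehicle_speed_kmh(a: int) -> int:
    """PID 0x0D: vehicle speed in km/h, the byte value directly."""
    return a
```

For example, a PID 0x0C response of bytes 0x1A 0xF8 decodes to (256·26 + 248)/4 = 1726.0 rpm.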
They're the tests we performed in our Note 8 real-world analysis: we put together a tool to extract and parse the frame data, and a UI automation system that allowed us to build macros that mimic real-world use cases by simulating touch input — scrolling, loading new activities or windows, and compound tests with complex UI navigation.
They perform data organization, modeling, integration, and other tasks to gather the data in a specific format using different processes and applications such as ETL (extract, transform, and load), Informatica, Erwin Data Modeler, etc., and develop database warehousing systems.
Those interested in a Mailroom Assistant position should be able to perform the following activities: receiving mail, loading mail on delivery carts, maintaining incoming mail records, assisting with payroll checking, maintaining inventories, archiving documents, and doing data entry work.
Performed onboarding of new data sources by identifying file transfer protocols, analyzing data, pre-processing data, and staging the data for loading into the database.
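The pre-process-and-stage step described above can be sketched as follows. The pipe-delimited layout (id|name|amount) and the staging schema are hypothetical, and sqlite3 stands in for the unnamed target database; the pattern is: parse, convert types, reject malformed rows, and stage the clean rows for a later bulk load.

```python
# Sketch: parse a delimited feed, type-convert, and stage valid rows.
import csv
import io
import sqlite3

def stage_rows(raw: str, con: sqlite3.Connection) -> int:
    """Parse pipe-delimited text and insert valid rows into staging."""
    con.execute(
        "CREATE TABLE IF NOT EXISTS staging (id INTEGER, name TEXT, amount REAL)"
    )
    staged = 0
    for row in csv.reader(io.StringIO(raw), delimiter="|"):
        try:
            rec = (int(row[0]), row[1].strip(), float(row[2]))
        except (IndexError, ValueError):
            continue  # reject malformed rows instead of failing the load
        con.execute("INSERT INTO staging VALUES (?, ?, ?)", rec)
        staged += 1
    con.commit()
    return staged
```

A real onboarding job would also quarantine the rejected rows for review rather than silently dropping them.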
Assist with setting up training for safety councils, track training of field employees, perform data entry, and load jobs and employee information.
Perform operations and administrative support, answer calls, manage data, schedule trucks, and verify load info.
Perform data entry, loading, and validation; support the production scheduler; and analyze root causes of issues and take corrective measures.
Oversaw 100% of all data warehouse tables, ETL (extract, transform, load), cubes, frameworks, models, SQL writing, documentation, and publishing initiatives, and performed all database changes for OLTP and OLAP databases, including indexing, table creation and dropping, foreign keys, and data migration.
Perform pipe stress analysis per the input data (geometry, load cases) using CAESAR II, applying static and dynamic analysis for seismic load conditions.
Perform basic troubleshooting on servers for issues related to data backup and loading errors.
Loaded data from MS Excel into Oracle tables and implemented SQL to perform data migration.
Performed data analysis on 150+ vendor product datasets and led the modeling and development of the initial enterprise data-loading ETL/ELT solution in SQL Server 2008 R2.
Reformatted the data, inserted the reformatted data into a staging table, performed data cleansing and data type conversions, and then loaded the data into the destination table.
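The staging-to-destination flow in the bullet above can be sketched as a single set-based statement: raw text lands in a staging table, then one INSERT...SELECT cleanses and type-converts on the way into the destination table. Table and column names are illustrative, and sqlite3 stands in for whatever RDBMS was actually used.

```python
# Sketch: cleanse and type-convert staged rows into the destination table.
import sqlite3

def load_from_staging(con: sqlite3.Connection) -> int:
    """Move staged rows into the destination table, cleansed and converted."""
    con.executescript(
        """
        CREATE TABLE IF NOT EXISTS staging_orders (order_id TEXT, qty TEXT);
        CREATE TABLE IF NOT EXISTS orders (order_id TEXT, qty INTEGER);
        """
    )
    cur = con.execute(
        """
        INSERT INTO orders (order_id, qty)
        SELECT UPPER(TRIM(order_id)),      -- data cleansing
               CAST(TRIM(qty) AS INTEGER)  -- data type conversion
        FROM staging_orders
        WHERE TRIM(qty) GLOB '[0-9]*'      -- keep convertible rows only
        """
    )
    con.commit()
    return cur.rowcount
```

Doing the conversion in one set-based statement keeps the load atomic and lets the database engine do the work, rather than converting row by row in application code.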
Performed system, unit, performance, load, regression, stress, and data interface testing.
Professional Duties & Responsibilities
• Served as technical consultant and engineer for varied technology companies
• Responsible for major client accounts including Emerson Motors and Therm-O-Disc
• Designed, developed, and launched multiple programs for a variety of applications
• Created backend logic for E-Warranty System and integrated with Oracle E-Business Suite Database using PL/SQL procedures
• Wrote scheduled interfaces for data extraction and loading into Oracle E-Business Suite
• Developed Oracle Reports using Oracle Developer Suite 10g, XML Publisher, and OBIEE
• Significant experience in design, development, and testing of E-Business Suite Oracle Apps 11i, D2K Forms, Reports 6i, XML Publisher Reports, Discoverer Reports, Conversions & Interfaces, and the Oracle Apps modules Inventory, Order Management, BOM, WIP, AR, AP, GL, and Install Base
• Completed multiple academic projects utilizing Java Swing for object-oriented programming, VB6, Oracle Database, Oracle 9i, Developer 2000, and AutoCAD
Performed all duties in a professional, positive, and timely manner
FISERV ISS, Senior Database Manager (Denver, CO) 10/2000 — 8/2008
• Provide production database administration for a Fortune 500 firm
• Perform 24/7 on-call support of Oracle and SQL Server databases varying in size (4 GB to 800 GB)
• Oversee backup and recovery using RMAN
• Responsible for daily production maintenance of Oracle and SQL Server databases, including log checking, space management, and database tuning
• Maintain the Oracle Data Guard standby databases by syncing them with production
• Perform 90-day restores of multiple Oracle and SQL Server databases
• Actively train coworkers in database restoration
• Convert Oracle non-ASM databases to ASM databases and perform upgrades
• Establish physical database parameters and review internal Oracle tables
• Identify and recover corrupted blocks on Oracle databases
• Install multiple Oracle ASM and non-ASM databases on both Linux and Windows platforms
• Install Oracle 10.2 two-node ASM RAC system on a Linux Itanium system
• Install SQL Server databases on the Windows 2000 platform
• Develop Oracle data models for internal applications
• Perform SQL tuning using explain plan and Oracle Grid Control
• Create multiple cron UNIX scripting jobs using the Bash shell
• Maintain nightly internal load job application processes
• Write multiple internal applications in the PL/SQL language
Professional Experience
Client — XL Insurance (Hartford, CT) 6/2008 — Present
Role — Business Intelligence Solutions Consultant — Insurance Data Warehouse
• Participate in information-gathering sessions to determine and assess project requirements, identifying best-fit architecture solutions in line with enterprise data warehouse architectural standards
• Work closely with the data modeler and the DBA in the design of the logical and physical data models
• Create and maintain models for Cognos, performing extensive star-schema modeling to enable reporting decentralization and allow for user-driven ad hoc reporting, as well as drawing upon SSRS and OBIEE reporting solutions
• Strategize with the ETL team to identify the best-case design strategy for ETL-related activities, including determination of ETL design patterns, load strategies, load timing and frequency, and data retrieval expectations
• Participate in providing Rough Order of Magnitude (ROM) estimates in and out of release projects, estimating resource requirements and managing within determined time constraints
• Assist in the development of security tools in Cognos 8 using LDAP and Active Directory while holding responsibility for maintaining run books and project documentation in SharePoint
Afterward, confirmatory factor analyses using the 11 items of the PNS-J as indicators were performed to examine whether the two-factor model — 4 items loaded on the Desire for Structure factor and the other 7 items loaded on the Response to Lack of Structure factor — fits the data better than the one-factor model.