Dear Readers, welcome to Event Manager Objective Questions and Answers. These have been designed specially to acquaint you with the nature of questions you may encounter during your job interview for the subject of Event Manager multiple choice questions. These objective type Event Manager questions are very important for campus placement tests and job interviews. In my experience, good interviewers hardly plan to ask any particular question during an interview; these model questions are commonly asked in the online technical tests and interviews of many IT and non-IT industries.
A. Sort
B. ETL
C. Format
D. String
Ans: D
A. Web Scraping
B. Data inspection
C. OLE DB Source
D. OLE DB Destination
Ans: D
A. These are used to identify which fields from which sources map to which destinations. It allows the ETL developer to identify whether a data type change or aggregation is needed before coding of the ETL process begins.
B. These can be used to flag an entire file-set that is ready for processing by the ETL process. It contains no meaningful data, but the fact that it exists is the key to the process.
C. Data is pulled from multiple sources to be merged into one or more destinations.
D. It is used to massage data in transit between the source and destination.
Ans: A
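The mapping described in option A above can be captured before any ETL coding begins. A minimal sketch in Python, using purely hypothetical source and destination column names for illustration:

    # A minimal sketch of a source-to-destination mapping document.
    # Column names and transform notes are assumptions, not real schema.
    field_mappings = [
        {"source": "cust_id",     "destination": "CustomerKey", "transform": "cast to int"},
        {"source": "order_total", "destination": "OrderAmount", "transform": "aggregate by day"},
        {"source": "order_dt",    "destination": "OrderDate",   "transform": None},
    ]

    # Reviewing the mapping up front shows which fields need a data type
    # change or aggregation before ETL coding starts.
    for m in field_mappings:
        note = m["transform"] or "straight copy"
        print(f'{m["source"]} -> {m["destination"]}: {note}')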
A. Pull Method
B. Push and Pull
C. Load in Parallel
D. Union all
Ans: B
A. Process to move data from a source to a destination.
B. A transactional database that is typically attached to an application. This source provides the benefit of known data types and standardized access methods. This system enforces data integrity.
C. All data in a flat file is in this format.
D. This control can be used to add columns to the stream or make modifications to data within the stream. It should be used for simple modifications.
Ans: B
A. Process to move data from a source to a destination.
B. The easiest to consume from the ETL standpoint.
C. Two methods to ensure data integrity.
D. Many routines of the Mainframe system are written in this.
Ans: D
A. Data inspection
B. Transformation
C. Extract, Transform, Load
D. Data Flow
Ans: C
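Extract, Transform, Load is the pattern these questions revolve around. A minimal, generic sketch of the three steps in Python; the file names and the "name"/"amount" columns are assumptions made only for illustration:

    import csv

    def extract(path):
        # Extract: read rows from a flat-file source.
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def transform(rows):
        # Transform: massage data in transit, e.g. tidy strings and cast amounts.
        for row in rows:
            row["name"] = row["name"].strip().title()
            row["amount"] = float(row["amount"])
        return rows

    def load(rows, path):
        # Load: write the cleaned rows to a destination file.
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["name", "amount"], extrasaction="ignore")
            writer.writeheader()
            writer.writerows(rows)

    if __name__ == "__main__":
        load(transform(extract("source.csv")), "destination.csv")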
A. Custom
B. Automation
C. Pull Method
D. Push Method
Ans: D
A. These are used to identify which fields from which sources map to which destinations. It allows the ETL developer to identify whether a data type change or aggregation is needed before coding of the ETL process begins.
B. These can be used to flag an entire file-set that is ready for processing by the ETL process. It contains no meaningful data, but the fact that it exists is the key to the process.
C. ETL can be used to automate the movement of data between two locations. This standardizes the process so that the load is done the same way every run.
D. This is used to create multiple streams within a data flow from a single stream. All records in the stream are sent down all paths. Typically uses a merge-join to recombine the streams later in the data flow.
Ans: B
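Option B above describes a flag (sentinel) file: it carries no data of its own, but its presence signals that the file-set is ready to process. A minimal sketch, with the flag-file path assumed for illustration:

    import os

    FLAG_FILE = "incoming/_READY"   # hypothetical sentinel created by the upstream system

    def fileset_is_ready():
        # The flag file contains no meaningful data; only its existence matters.
        return os.path.exists(FLAG_FILE)

    if fileset_is_ready():
        print("Flag file found - start the ETL run.")
    else:
        print("Flag file missing - skip this cycle.")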
A. Similar to "break up processes", checkpoints provide markers for what data has been processed in case an error occurs during the ETL process.
B. A structured text file, similar to XML.
C. Many routines of the Mainframe system are written in this.
D. It is used to import text files for ETL processing.
Ans: A
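Option A refers to checkpoints: markers recording how far processing got, so a failed run can resume instead of restarting from scratch. A minimal sketch using a small state file; the file name and batch numbers are assumptions:

    import json, os

    CHECKPOINT = "etl_checkpoint.json"   # hypothetical state file

    def last_processed():
        # Read the marker left by the previous run, if any.
        if os.path.exists(CHECKPOINT):
            with open(CHECKPOINT) as f:
                return json.load(f)["last_batch"]
        return 0

    def save_checkpoint(batch_id):
        # Record progress so a failure mid-run does not force a full restart.
        with open(CHECKPOINT, "w") as f:
            json.dump({"last_batch": batch_id}, f)

    for batch_id in range(last_processed() + 1, 6):
        # ... process batch batch_id here ...
        save_checkpoint(batch_id)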
A. ETL
B. XML
C. Sort
D. EBCDIC
Ans: D
A. ETL
B. Custom
C. OLTP
D. Sort
Ans: B
A. Many routines of the Mainframe system are written in this.
B. Data is pulled from multiple sources to be merged into one or more destinations.
C. It allows multiple streams to be created from a single stream. Only rows that match the criteria for a given path are sent down that path.
D. This is used to create multiple streams within a data flow from a single stream. All records in the stream are sent down all paths. Typically uses a merge-join to recombine the streams later in the data flow.
Ans: C
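Option C matches a conditional split: one input stream, several output paths, and each row goes only down the path whose criteria it meets. A minimal sketch with assumed row fields:

    rows = [
        {"country": "US", "amount": 120.0},   # hypothetical sample rows
        {"country": "CA", "amount": 80.0},
        {"country": "US", "amount": 15.5},
    ]

    # Each row is routed to exactly one path based on its criteria.
    us_path, other_path = [], []
    for row in rows:
        (us_path if row["country"] == "US" else other_path).append(row)

    print(len(us_path), "rows to the US path,", len(other_path), "elsewhere")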
A. The easiest to consume from the ETL standpoint.
B. Three components of data flow.
C. Three common usages of ETL.
D. Two methods to ensure data integrity.
Ans: A
A. OLTP
B. Mainframe
C. EBCDIC
D. Multicast
Ans: D
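Multicast contrasts with the conditional split above: every record is copied down every output path. A minimal sketch, using deep copies so the branches stay independent; the sample records are assumptions:

    import copy

    rows = [{"id": 1, "value": 10}, {"id": 2, "value": 20}]   # hypothetical stream

    # Multicast: all records go down all paths, unlike a conditional split.
    audit_path = copy.deepcopy(rows)
    load_path = copy.deepcopy(rows)

    print(len(audit_path), "records to audit,", len(load_path), "records to load")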
A. Mainframe
B. Union all
C. File Name
D. Multicast
Ans: A
A. File Name
B. Mainframe
C. Format
D. Union all
Ans: A
A. Format
B. COBOL
C. Tool Suite
D. Flat files
Ans: C
A. Data Scrubbing
B. EBCDIC
C. String
D. Web Scraping
Ans: D
A. Three components of data flow.
B. It is used to import text files for ETL processing.
C. The easiest to consume from the ETL standpoint.
D. Shows the path to the file to be imported.
Ans: B
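Option B describes the role of a flat file source: bringing text files into the ETL stream. A minimal sketch, assuming a comma-delimited file with column names in the first row (the file name is hypothetical):

    import csv

    # Hypothetical delimited file with headers in the first row.
    with open("extract.txt", newline="") as f:
        reader = csv.DictReader(f, delimiter=",")
        for row in reader:
            print(row)   # each row enters the data flow as a dictionary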
A. Sources, Transformation, Destination
B. Data inspection
C. Row Count Inspection, Data Inspection
D. Row Count Inspection
Ans: C
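Row count inspection and data inspection are the two integrity checks named in option C. A minimal sketch of the row-count half, with both counts assumed for illustration:

    source_row_count = 10_000        # hypothetical count taken at extract time
    destination_row_count = 9_998    # hypothetical count taken after the load

    # Row count inspection: the simplest integrity check is that nothing was dropped.
    if source_row_count != destination_row_count:
        print(f"Row count mismatch: {source_row_count} extracted, "
              f"{destination_row_count} loaded")
    else:
        print("Row counts match")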
A. Data is pulled from multiple sources to be merged into one or more destinations.
B. It is used to import text files for ETL processing.
C. Process to move data from a source to a destination.
D. It is used to massage data in transit between the source and destination.
Ans: D
A. Data Scrubbing
B. Sources, Transformation, Destination
C. Merging Data
D. Merging Data, Data Scrubbing, Automation
Ans: D
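Merging data, data scrubbing, and automation are the three common ETL usages in option D. A minimal data-scrubbing sketch over assumed raw values:

    raw = [" Alice ", "BOB", None, "carol"]   # hypothetical dirty values

    # Data scrubbing: normalize case, trim whitespace, drop missing values.
    clean = [name.strip().title() for name in raw if name]
    print(clean)   # ['Alice', 'Bob', 'Carol']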
A. A value of Delimited should be selected for delimited files.
B. Data is pulled from multiple sources to be merged into one or more destinations.
C. This will reduce the run time of the ETL process and reduce the window in which a hardware failure can affect the process.
D. This should be checked if column names have been included in the first row of the file.
Ans: C
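Option C is about breaking the ETL work into smaller pieces: shorter pieces finish faster and shrink the window in which a hardware failure can hurt the run. A minimal sketch using a thread pool over assumed file chunks:

    from concurrent.futures import ThreadPoolExecutor

    chunks = ["part_1.csv", "part_2.csv", "part_3.csv"]   # hypothetical file chunks

    def load_chunk(path):
        # ... extract, transform and load one chunk here ...
        return f"{path} loaded"

    # Loading chunks in parallel shortens the run and the failure window.
    with ThreadPoolExecutor(max_workers=3) as pool:
        for result in pool.map(load_chunk, chunks):
            print(result)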
A. Hard Drive I/O
B. Mainframe
C. Tool Suite
D. Data Scrubbing
Ans: A
A. Sort
B. Format
C. String
D. OLTP
Ans: B
A. Row Count Inspection, Data Inspection
B. Format of the Date
C. Column names in the first data row checkbox
D. Do most work in the transformation phase
Ans: C