Informatica Interview Questions

  •  

    What is a Source Qualifier?

    It represents the rows that the PowerCenter Server reads from the source when the session runs.


    Editorial / Best Answer

    sprajarajan  

    • Member Since Mar-2008 | Aug 8th, 2008


    The Source Qualifier is the default transformation for relational and flat file sources; Informatica reads source data through it.
    For relational sources it can filter the data, sort the data, and join homogeneous source systems.
    You can join any number of tables from the same source database in a single Source Qualifier.
    You cannot join flat files in a Source Qualifier: the filter, join, and sort options are implemented as SQL issued against the source database, so when you open a Source Qualifier attached to a flat file, all of these options are disabled.
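
    For example, a Source Qualifier joining two relational tables from the same database effectively issues a query like the following sketch (table and column names are hypothetical):

        SELECT CUSTOMERS.CUST_ID, CUSTOMERS.CUST_NAME,
               ORDERS.ORDER_ID, ORDERS.AMOUNT
        FROM   CUSTOMERS, ORDERS
        WHERE  CUSTOMERS.CUST_ID = ORDERS.CUST_ID   -- user-defined join / source filter
        ORDER BY CUSTOMERS.CUST_ID                  -- produced by the sorted-ports setting

    Because this SQL runs on the source database, none of these options can apply to a flat file source.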

    joyfun23

    • Jan 26th, 2010

    1. The Source Qualifier is the most important transformation; it converts the source data types into the compatible native data types of a mapping. 2. Without a Source Qualifier a mapping cannot be created; after extractio...

  •  

    What is Data Transformation Manager?

    After the Load Manager performs validations for the session, it creates the DTM process. The DTM process is the second process associated with the session run. The primary purpose of the DTM process is to create and manage the threads that carry out the session tasks. The DTM allocates process memory for the session and divides it into buffers; this is also known as buffer memory. It creates the main thread,...


    Editorial / Best Answer

    Goush  

    • Member Since May-2007 | May 15th, 2007


    When the workflow reaches a session, the Integration Service process starts
    the DTM process. The DTM is the process associated with the session, and it
    performs the following tasks:

    1. Retrieves and validates session information from the repository.
    2. Performs pushdown optimization when the session is configured for pushdown optimization.
    3. Adds partitions to the session when the session is configured for dynamic partitioning.
    4. Forms partition groups when the session is configured to run on a grid.
    5. Expands the service process variables, session and mapping variables, and parameters.
    6. Creates the session log.
    7. Validates source and target code pages.
    8. Verifies connection object permissions.
    9. Runs pre-session shell commands, stored procedures, and SQL.
    10. Sends a request to start worker DTM processes on other nodes when the session is configured to run on a grid.
    11. Creates and runs mapping, reader, writer, and transformation threads to extract, transform, and load data.
    12. Runs post-session stored procedures, SQL, and shell commands.
    13. Sends post-session email.

    chandrarekha

    • Sep 6th, 2007

    The Load Manager and the DTM are components of the Informatica server. The Load Manager manages the load on the server by maintaining a queue of sessions and releasing them on a first-come, first-served basis...

  •  

    Discuss the advantages and disadvantages of star and snowflake schemas.


    Editorial / Best Answer

    Answered by: swati

    • Nov 5th, 2005


    A star schema consists of a single fact table surrounded by dimension tables. In a snowflake schema the dimension tables are further connected to sub-dimension tables.

    In a star schema the dimension tables are denormalized; in a snowflake schema the dimension tables are normalized.

    A star schema is typically used for report generation; a snowflake schema is typically used for cubes.

    The advantage of the snowflake schema is that the normalized tables are easier to maintain, and it also saves storage space.

    The disadvantage of the snowflake schema is that it reduces the effectiveness of navigation across the tables because of the larger number of joins between them.
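
    As a minimal SQL sketch of the structural difference (hypothetical product dimension):

        -- Star: one denormalized dimension table
        CREATE TABLE DIM_PRODUCT_STAR (
            PRODUCT_KEY   INT PRIMARY KEY,
            PRODUCT_NAME  VARCHAR(100),
            CATEGORY_NAME VARCHAR(100)   -- category value repeated on every product row
        );

        -- Snowflake: the same dimension normalized into a sub-dimension
        CREATE TABLE DIM_CATEGORY (
            CATEGORY_KEY  INT PRIMARY KEY,
            CATEGORY_NAME VARCHAR(100)
        );
        CREATE TABLE DIM_PRODUCT_SNOW (
            PRODUCT_KEY   INT PRIMARY KEY,
            PRODUCT_NAME  VARCHAR(100),
            CATEGORY_KEY  INT REFERENCES DIM_CATEGORY (CATEGORY_KEY)
        );

    Reaching CATEGORY_NAME in the snowflake version costs an extra join, which is the navigation overhead described above; in exchange, each category name is stored once instead of once per product.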

    halgai

    • Oct 8th, 2007

    Snowflakes are an addition to the Kimball dimensional system to enable that system to handle hierarchical data. When Kimball proposed the dimensional data warehouse, it was not at first recognized t...

    Richard

    • Feb 2nd, 2006

    Which one to use depends on the client's requirements and standards: some follow the snowflake schema, others the star schema.

  •  

    What are Target Types on the Server?

    Target Types are File, Relational and ERP.


    Editorial / Best Answer

    manojkumar_dwh  

    • Member Since Apr-2007 | Apr 14th, 2007


    PowerCenter can load data into the following targets:

    • Relational. Oracle, Sybase, Sybase IQ, Informix, IBM DB2, Microsoft SQL Server, and Teradata.
    • File. Fixed and delimited flat file and XML.
    • Application. You can purchase additional PowerCenter Connect products to load data into SAP BW. You can also load data into IBM MQSeries message queues and TIBCO.
    • Other. Microsoft Access.

    You can load data into targets using ODBC or native drivers, FTP, or external loaders.

  •  

    What is the Aggregator transformation?

    Aggregator transformation allows you to perform aggregate calculations, such as averages and sums.


    Editorial / Best Answer

    Answered by: Praveen vasudev

    • Sep 12th, 2005


    The Aggregator transformation is much like the GROUP BY clause in traditional SQL.

    It is a connected, active transformation that takes the incoming data from the mapping pipeline, groups it based on the group-by ports specified, and calculates aggregate functions (AVG, SUM, COUNT, STDDEV, etc.) for each of those groups.

    From a performance perspective, if your mapping has an Aggregator transformation, use filters and sorters very early in the pipeline if there is any need for them.

    veepee
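
    To make the GROUP BY analogy concrete: an Aggregator with a group-by port on a department key and aggregate expressions over salary behaves roughly like this SQL sketch (table and column names are hypothetical):

        SELECT   DEPT_ID,
                 SUM(SALARY) AS TOTAL_SALARY,
                 AVG(SALARY) AS AVG_SALARY,
                 COUNT(*)    AS EMP_COUNT
        FROM     EMPLOYEES
        GROUP BY DEPT_ID;

    Feeding the Aggregator data already sorted on the group-by ports (with the Sorted Input option enabled) lets it emit each group as soon as that group is complete instead of caching all input rows, which is part of why sorters early in the pipeline help.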

    sivakp

    • Mar 13th, 2011

    1. The Aggregator transformation allows you to perform aggregate calculations such as SUM, MAX, MIN, FIRST, and LAST. 2. The Aggregator transformation allows you to perform aggregate calculations per group.

    shr_4

    • Oct 27th, 2010

    To perform group-by calculations we use the Aggregator transformation. It performs calculations similar to the Expression transformation, but the difference between the two is that the Aggregator Transform...

  •  

    What is Load Manager?


    Editorial / Best Answer

    Answered by: Deb

    • Jun 8th, 2005


    I am providing this answer as taken from the Informatica 7.1.1 manual.

    Ans: While running a workflow, the PowerCenter Server uses the Load Manager process and the Data Transformation Manager (DTM) process to run the workflow and carry out workflow tasks. When the PowerCenter Server runs a workflow, the Load Manager performs the following tasks:

    1. Locks the workflow and reads workflow properties.
    2. Reads the parameter file and expands workflow variables.
    3. Creates the workflow log file.
    4. Runs workflow tasks.
    5. Distributes sessions to worker servers.
    6. Starts the DTM to run sessions.
    7. Runs sessions from master servers.
    8. Sends post-session email if the DTM terminates abnormally.

    When the PowerCenter Server runs a session, the DTM performs the following tasks:

    1. Fetches session and mapping metadata from the repository.
    2. Creates and expands session variables.
    3. Creates the session log file.
    4. Validates session code pages if data code page validation is enabled. Checks query conversions if data code page validation is disabled.
    5. Verifies connection object permissions.
    6. Runs pre-session shell commands.
    7. Runs pre-session stored procedures and SQL.
    8. Creates and runs mapping, reader, writer, and transformation threads to extract, transform, and load data.
    9. Runs post-session stored procedures and SQL.
    10. Runs post-session shell commands.
    11. Sends post-session email.

    raju4dw

    • Aug 6th, 2010

    The Load Manager is responsible for dispatching sessions and maintains a queue of sessions on a first-in, first-out basis; the DTM (Data Transformation Manager) is the master process that then runs each dispatched session. D...


  •  

    How do you identify existing rows of data in the target table using a Lookup transformation?

    You can identify existing rows of data using an unconnected Lookup transformation.


    Editorial / Best Answer

    Answered by: SK

    • Aug 30th, 2007


    There are two ways to look up the target table to verify whether a row exists:

    1. Use a connected dynamic-cache Lookup and check the value of the NewLookupRow output port to decide whether the incoming record already exists in the table/cache.

    2. Use an unconnected Lookup, call it from an Expression transformation, and check the lookup return value (NULL / not NULL) to decide whether the incoming record already exists in the table.
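
    Both approaches implement the same logic as an outer join against the target; as a SQL sketch (table and column names are hypothetical):

        SELECT src.CUST_ID,
               CASE WHEN tgt.CUST_ID IS NULL THEN 'INSERT' ELSE 'UPDATE' END AS ROW_FLAG
        FROM   STG_CUSTOMERS src
        LEFT OUTER JOIN DIM_CUSTOMERS tgt
               ON tgt.CUST_ID = src.CUST_ID;

    In the mapping, the NULL test is what the expression around the unconnected lookup performs, while in the dynamic-cache case the NewLookupRow port carries the same information (0 = no change, 1 = insert, 2 = update).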

    doppalpaudi

    • Jul 9th, 2010

    The Lookup transformation is used to check whether the data is present in the target or not. This transformation is of two types: 1. Connected Lookup transformation, 2. Unconnected Lookup transformation. Using either of these, we can check the target for existing data.

  •  

    Mapplet Transformations

    What are the transformations not used in mapplet and why?


    Editorial / Best Answer

    gazulas  

    • Member Since Jan-2006 | Jul 2nd, 2008


    A mapplet cannot include the following:

    • Normalizer transformations
    • Cobol sources
    • XML Source Qualifier transformations
    • XML sources
    • Target definitions
    • Pre- and post- session stored procedures
    • Other mapplets

    The general reason is that a mapplet must be reusable inside any mapping, so objects that tie it to a specific source or target type or to session-level configuration (COBOL and XML sources with their specialized Source Qualifiers, target definitions, pre- and post-session stored procedures, or other mapplets) are not allowed.

    sahan

    • Jul 16th, 2009

    Normalizer transformations, XML Source Qualifiers, XML files, and other targets. If you need to use a Sequence Generator transformation, use a reusable Sequence Generator. If you need to use a Stored Procedure transformation, make the stored procedure type Normal.

  •  

    How many ways you can update a relational source definition ?

    Two ways: 1. Edit the definition. 2. Reimport the definition.


    Editorial / Best Answer

    Sumithav29  

    • Member Since Jun-2009 | Jun 12th, 2009


    1. If you want to make minor changes to the source definition, you can edit it in the Source Analyzer; the Designer then propagates the changes to all mappings using that source.
    2. If the source changes are significant, re-import the source definition; choosing the Replace option while re-importing reflects the changes in all mappings where the source is used.

  •  

    What is granularity in Informatica? How should the level of granularity be chosen for a fact table and for a dimension table?

    P.Sivakumar

    • Sep 12th, 2007

    Granularity is the lowest level of detail at which information is stored in the database. A fact table should be kept at the lowest grain the business needs to analyze (for example, one row per order line), and each dimension table should describe exactly that grain.
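
    For example, the same sales data can be kept at two different grains (hypothetical tables):

        -- Transaction grain: one row per order line (finest detail, largest table)
        CREATE TABLE FACT_SALES_LINE (
            ORDER_ID    INT,
            LINE_NO     INT,
            PRODUCT_KEY INT,
            DATE_KEY    INT,
            QTY         INT,
            AMOUNT      DECIMAL(12,2)
        );

        -- Daily grain: one row per product per day (coarser, smaller, less detail)
        CREATE TABLE FACT_SALES_DAILY (
            PRODUCT_KEY  INT,
            DATE_KEY     INT,
            TOTAL_QTY    INT,
            TOTAL_AMOUNT DECIMAL(12,2)
        );

    The daily table cannot answer questions about individual orders, which is why the fact grain should be chosen first and kept as fine as the business requires.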

  •  

    What is the mapping for unit testing in Informatica? Are there other kinds of testing in Informatica, and how do we perform them as ETL developers? How do the testing people do their testing, and are there any specific tools for it?

    sathish

    • Apr 6th, 2007

    OK, we will refer to the development document. Could you explain how to write test cases for unit testing in the development document before development, and give an example of sample test cases?

    Gopalakrishnan Kannan

    • Dec 21st, 2006

    Hi. To test the mapping: 1. Prepare a mapping document. This mapping document explains the entire transaction from start to finish. It should contain the column-level transformation logic from s...
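
    As an illustration of what such a column-level test case can contain, source-to-target checks are often written as SQL (hypothetical names; MINUS is Oracle syntax, EXCEPT in most other databases):

        -- Rows present in the source but missing from the target
        SELECT CUST_ID, CUST_NAME FROM SRC_CUSTOMERS
        MINUS
        SELECT CUST_ID, CUST_NAME FROM TGT_DIM_CUSTOMERS;

        -- Row-count reconciliation between source and target
        SELECT (SELECT COUNT(*) FROM SRC_CUSTOMERS)     AS SRC_CNT,
               (SELECT COUNT(*) FROM TGT_DIM_CUSTOMERS) AS TGT_CNT
        FROM DUAL;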
