Answered Questions

  • Single row converted into multiple rows using transformer stage

    Input
    -----------
    name | no
    -----------
    Bose | 1
    Mani | 2
    Arun | 3

    Output
    -----------
    name | no
    -----------
    Bose | 1
    Mani | 2
    Mani | 2
    Arun | 3
    Arun | 3
    Arun | 3

    How can I get this output using the Transformer stage? Help me ASAP.


    • Feb 15th, 2018

    Transformer => Functions => String => Str( , no)


    • Dec 20th, 2017

    Use a Normalize component; in its transform, set length() to the `no` column,
    and drag-and-drop the input fields to the output.
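A minimal sketch of the row-replication logic above in plain Python: each input record (name, no) is emitted `no` times, mirroring what a Transformer loop (`@ITERATION <= no`) or a Normalize component would do. Column names are taken from the question.

```python
def replicate_rows(records):
    """Emit each (name, no) record `no` times."""
    output = []
    for name, no in records:
        for _ in range(no):
            output.append((name, no))
    return output

rows = [("Bose", 1), ("Mani", 2), ("Arun", 3)]
# Bose once, Mani twice, Arun three times
print(replicate_rows(rows))
```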

  • Transformer

    I have the input below and want the output shown:

    Input          Output
    slno | Name    slno | Name
    1    | X       1    | X
    2    | Y       2    | X
    3    | Z       2    | X
    ...

  • Duplicate Record in Datastage

    I have the scenario below; how can we achieve it?

    City1, City2, Distance
    ======================
    blr,   pune,  1000 km
    pune,  blr,   1000 km

    The two records carry the same data, so we need to delete either one of the duplicates.

    siva krishna

    • Jun 16th, 2017

    Hi, I've solved this as mentioned below:

    Seq stage (source) -> Sort stage -> Transformer stage -> Remove Duplicates stage -> Seq stage (target)

    Read the source and add a FILENAME column...
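A sketch (in plain Python, not DataStage) of the sort-then-deduplicate idea: normalise each (city1, city2) pair into a sorted key so that (blr, pune) and (pune, blr) collide, then keep only the first record per key.

```python
def drop_symmetric_duplicates(records):
    """Keep one record per unordered (city1, city2) pair."""
    seen = set()
    kept = []
    for city1, city2, distance in records:
        key = tuple(sorted((city1, city2)))  # order-independent key
        if key not in seen:
            seen.add(key)
            kept.append((city1, city2, distance))
    return kept

data = [("blr", "pune", "1000 km"), ("pune", "blr", "1000 km")]
print(drop_symmetric_duplicates(data))  # only the first record survives
```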

  • Print Minimum & Maximum Salary of Respective Employee

    I have an input file in the below format:

    NAME SAL
    ---- ---
    A    4000
    B    3000
    C    8000
    A    2000
    B    7000
    C    5000
    B    2000
    C    9000
    A    1000

    If I use Sort -> Aggregator (group by NAME), it will give three columns, like:

    NAME MAX() MIN()
    ---- ----- -----
    A    4000  1000

    But my requirement is to generate output which gives the maximum & minimum salary...


    • Apr 7th, 2017

    Use Seq File -> Aggregator stage (group by Emp_Name and find the min and max salary) -> Pivot stage (horizontal pivot).
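A plain-Python sketch of the Aggregator-then-Pivot idea: group by name, compute min and max, then pivot each aggregate into its own output row instead of separate columns.

```python
from collections import defaultdict

def min_max_rows(records):
    """Return one (name, measure, value) row per aggregate."""
    salaries = defaultdict(list)
    for name, salary in records:
        salaries[name].append(salary)
    rows = []
    for name in sorted(salaries):
        rows.append((name, "MAX", max(salaries[name])))  # pivoted to a row
        rows.append((name, "MIN", min(salaries[name])))
    return rows

data = [("A", 4000), ("B", 3000), ("C", 8000), ("A", 2000),
        ("B", 7000), ("C", 5000), ("B", 2000), ("C", 9000), ("A", 1000)]
print(min_max_rows(data))
```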

  • Load 10 Input files into 10 Target tables at a time

    I have 10 input files (F1, F2, ..., F10) that need to be loaded into 10 target tables (T1, T2, ..., T10). That is the scenario for 10 tables, but in future, if I receive 100 input files, I need to load them into the respective 100 target tables. After loading the input files into the target tables, I need a confirmation in the respective target tables (by input file name). Please...


    • Jun 27th, 2017

    This can be achieved by creating a multiple-instance job, parameterizing the source file name and table name, and enabling runtime column propagation.


    • Apr 24th, 2017

    We can do that by allowing multiple instances for a single job in the job properties tab.
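A hedged sketch of a driver script for the approach above: one invocation of a multiple-instance job per (file, table) pair, with the file and table passed as job parameters. `dsjob` is the standard DataStage command-line client, but the project name, job name, and parameter names below are assumptions; commands are only built here, not executed.

```python
def build_dsjob_commands(project, job, file_table_pairs):
    """Build one `dsjob -run` command per (file, table) pair."""
    commands = []
    for i, (src_file, tgt_table) in enumerate(file_table_pairs, 1):
        commands.append([
            "dsjob", "-run",
            "-param", f"SourceFile={src_file}",   # assumed parameter name
            "-param", f"TargetTable={tgt_table}",  # assumed parameter name
            project, f"{job}.INST{i}",             # job.invocationid
        ])
    return commands

# Works the same for 10 or 100 files: only the pair list grows.
pairs = [(f"F{n}", f"T{n}") for n in range(1, 11)]
cmds = build_dsjob_commands("DWH_PROJ", "LoadGeneric", pairs)
print(len(cmds))  # 10 commands, one invocation per file/table pair
```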

  • Test Target Phone Number Format

    The source has phone numbers with 8 digits and the target has 10 digits. How do I test that the target phone number has 10 digits and starts with +91?


    • May 17th, 2016

    SELECT t.phoneno
    FROM target t, source s
    WHERE SUBSTRING(t.phoneno, 1, 3) = '+91'
      AND '+91' || s.phoneno = t.phoneno
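The same check sketched in Python: assume, as the SQL above does, that a valid target number is the literal "+91" followed by the 8-digit source number.

```python
import re

# Pattern: literal "+91" followed by exactly 8 digits (assumed format).
PHONE_RE = re.compile(r"^\+91\d{8}$")

def target_matches_source(target, source):
    """True if target is '+91' + the 8-digit source number."""
    return bool(PHONE_RE.match(target)) and target == "+91" + source

print(target_matches_source("+9112345678", "12345678"))  # True
print(target_matches_source("9112345678", "12345678"))   # False: missing '+'
```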

  • Select the Even and Odd records

    How do you select the even and odd records in SQL Server and in Oracle without using ROWNUM or ROWID?


    • Aug 22nd, 2016

    SELECT customerid,
           CASE WHEN customerid % 2 = 0 THEN 'even'
                WHEN customerid % 2 <> 0 THEN 'odd'
           END AS result
    FROM customers

    (In Oracle, use MOD(customerid, 2) instead of %, which Oracle does not support.)
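A runnable version of the CASE expression using SQLite from Python (the `%` operator behaves the same in SQL Server; Oracle would need `MOD(customerid, 2)`). The table and data are made up for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (customerid INTEGER)")
conn.executemany("INSERT INTO customers VALUES (?)", [(1,), (2,), (3,), (4,)])

# Classify each id as even or odd with a CASE expression.
rows = conn.execute("""
    SELECT customerid,
           CASE WHEN customerid % 2 = 0 THEN 'even' ELSE 'odd' END AS result
    FROM customers
    ORDER BY customerid
""").fetchall()
print(rows)  # [(1, 'odd'), (2, 'even'), (3, 'odd'), (4, 'even')]
```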

  • For ETL testers

    Can anyone please explain the roles and responsibilities of ETL testers, as well as their work from start to end in ETL testing? Thanks.


    • Jan 5th, 2017

    ETL Testing Process Steps:
    1. Understand the end-to-end design.
    2. Test plan preparation.
    3. Test scenario preparation (optional).
    4. Preparing the test queries to perform the following validations: ...
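A small sketch of one of the validations step 4 refers to: comparing source and target row counts with plain SQL, here run against SQLite from Python. Table names and data are illustrative only.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER)")
conn.execute("CREATE TABLE tgt (id INTEGER)")
conn.executemany("INSERT INTO src VALUES (?)", [(i,) for i in range(5)])
conn.executemany("INSERT INTO tgt VALUES (?)", [(i,) for i in range(5)])

# Basic completeness check: every source row made it to the target.
src_count = conn.execute("SELECT COUNT(*) FROM src").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]
print(src_count == tgt_count)  # True
```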

  • Datastage job scenario question

    My input has a unique column (id) with the values 10, 20. Can I get the first record in one output file, the last record in another output file, and the rest of the records in a third output file?


    • Nov 25th, 2018

    If your output file is a sequential file, then you can use the filter property.


    • Nov 25th, 2018

    If the input has 2 partitions, then I would get 2 rows from the LastRow() function; the rest is correct.
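A plain-Python sketch of the required split: first record to one output, last record to another, everything else to a third. In DataStage this maps to a Transformer with first/last-row detection on a single partition; here it is just list slicing.

```python
def split_first_last_rest(records):
    """Return (first, rest, last) outputs; handles short inputs too."""
    first = records[:1]
    last = records[-1:] if len(records) > 1 else []
    rest = records[1:-1]
    return first, rest, last

ids = [10, 20, 30, 40, 50]
first, rest, last = split_first_last_rest(ids)
print(first, rest, last)  # [10] [20, 30, 40] [50]
```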

  • What is the Difference between an ODS and a Staging Area?


    • May 17th, 2017

    You mentioned that the ODS comes after the staging area, which I think is not true. The flow is ODS -> staging -> DWH as far as I know. Kindly correct me if I am wrong.