Execute Workflows Conditionally
I have two workflows, W1 and W2. I need to execute W2 only after W1 has completed successfully. How can I do this?
- Dec 22nd, 2022
We can trigger W2 after completion of W1 by using pmcmd commands in shell scripting.
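As a hedged sketch of that approach (the service, domain, folder, and credential values below are placeholders, not from the original post), the key is that -wait makes pmcmd block until W1 finishes and report its status:

```shell
# Sketch: start W2 only if W1 completed successfully.
# Int_Svc, Dom_Dev, MyFolder, admin and $PM_PASS are placeholder values.
run_w2_after_w1() {
    # -wait blocks until the workflow finishes; pmcmd then exits non-zero
    # if the workflow failed, so the second start is skipped on failure
    pmcmd startworkflow -sv Int_Svc -d Dom_Dev \
          -u admin -p "$PM_PASS" -f MyFolder -wait W1 || return 1
    pmcmd startworkflow -sv Int_Svc -d Dom_Dev \
          -u admin -p "$PM_PASS" -f MyFolder -wait W2
}
```

Scripting aside, the same sequencing can also be achieved inside a single workflow by linking the sessions.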
- Apr 25th, 2019
Can we not use a worklet to add the two workflows and link them to run one after the other?
What metadata do you import while importing a relational source definition from a database?
Editorial / Best Answer
Answered by: srinvas vadlakonda
- Sep 28th, 2006
- Jun 12th, 2009
ODBC data source, username, owner name, password. Once the connection is established, select the table.
Incremental Loading Flat File Data Capture
How can we do incremental loading if the source is a flat file?
- Dec 31st, 2020
If your source flat file is always a full file, you can use the natural keys to identify, via a lookup on the target table, whether records are existing or new. If records are new, go ahead...
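Outside Informatica, the lookup-on-natural-key logic described above can be sketched with plain key files (file names and key values here are made up):

```shell
# Sketch: find records in today's full file whose natural keys are not
# yet in the target; these are the inserts.
printf 'K1\nK2\nK3\nK4\n' | sort > src_keys.txt   # keys in the full source file
printf 'K1\nK2\n'         | sort > tgt_keys.txt   # keys already in the target
# comm -13 prints lines unique to the second file: the new keys
comm -13 tgt_keys.txt src_keys.txt
```

In the mapping, this comparison is the lookup on the target table; rows with no lookup match are routed to the insert path.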
Update Records in a Flatfile
Why do we use a Joiner transformation to update records in flat files?
- Oct 15th, 2020
A flat file cannot be updated; it is always overwritten.
Error: OBJM_54538 with Informatica PowerCenter
Could not execute action...
The Service Name_repository could not be enabled due to the following error: OBJM_54538 - Unable to connect to the repository Name_repository on database DESKTOP-4FVDDK@IPC. Error: OBJM_54538
Please could you help me with this issue?
In which conditions can we not use the Joiner transformation (limitations of the Joiner transformation)?
A Joiner transformation cannot be used when:
- Both pipelines begin with the same original data source.
- Both input pipelines originate from the same Source Qualifier transformation.
- Both input pipelines originate from the same Normalizer transformation.
- Both input pipelines originate from the same Joiner transformation.
- Either input pipeline contains an Update Strategy transformation.
- Either input pipeline contains a connected or unconnected...
- Jun 7th, 2019
We cannot use an Update Strategy transformation in a pipeline feeding a Joiner, because the Update Strategy's "treat rows as" property (insert, update, delete) is resolved only at runtime; the Joiner therefore cannot know each row's exact type until runtime, which prevents it from proceeding.
- Sep 17th, 2014
We cannot use a Joiner transformation immediately after a Sequence Generator or Update Strategy transformation: 1. The Joiner joins 2 sources and cannot join the Sequence Generator as a 3rd source. Also, if we select the Sequence Generator as 1...
SCD Type 2 Mappings
In an SCD Type 2 mapping, I have 50 records. How can I compare them without going record by record?
- Mar 12th, 2019
Use a Joiner transformation instead of a Lookup to the target table.
- Nov 1st, 2018
Use an MD5 checksum in an Expression transformation to compare the incoming and existing rows.
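As a sketch of that checksum idea (the column values below are made up), hashing the concatenated compare columns lets one comparison replace checking every column of every record:

```shell
# Sketch: one MD5 per row replaces comparing each column separately.
src_row='100|kol|2020-01-01'    # incoming record
tgt_row='100|bang|2019-01-01'   # current record in the target
src_md5=$(printf '%s' "$src_row" | md5sum | cut -d' ' -f1)
tgt_md5=$(printf '%s' "$tgt_row" | md5sum | cut -d' ' -f1)
if [ "$src_md5" = "$tgt_md5" ]; then echo unchanged; else echo changed; fi
```

Here the rows differ, so it prints "changed". In the mapping itself, the equivalent is the MD5() function in an Expression transformation applied to the concatenated ports.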
What is the difference between SQL Overriding in Source qualifier and Lookup transformation?
- Oct 4th, 2018
1. An SQL override limits the number of incoming rows entering the mapping pipeline; a lookup override limits the number of lookup rows, avoiding a whole-table scan and saving lookup time ...
- Nov 28th, 2013
Using the Source Qualifier you can extract data from sources; in an override you can write queries to run in the database, whereas in the SQL transformation you can write and run queries in the database and get the res...
What are the options in the target session of an Update Strategy transformation?
Insert, Delete, Update, Update as update, Update as insert, Update else insert, Truncate table
- Oct 1st, 2018
- Feb 3rd, 2006
Update as Insert: This option specifies that all update records from the source be flagged as inserts in the target. In other words, instead of updating the records in the target, they are inserted as new...
Index and Data Cache Files
Assuming that the master pipeline has 5 fields, two of which are part of the join condition, and that all fields are connected downstream to the next transformation, how many fields are in the index and data cache files?
a. 5 fields are in index and 3 in data cache
b. 3 fields are in index and 2 in data cache
c. 2 fields are in index and 3 in data cache
- Aug 27th, 2018
The answer should be C: the 2 join-condition fields are stored in the index cache and the remaining 3 fields in the data cache.
How do I pass an argument retrieved from a mapping to a Command task?
In a unix environment I have an application that takes several arguments. One of these I get from a session mapping. How do I pass this argument into my application?
- Aug 12th, 2018
You can pass the session variable to a workflow variable through post-session variable assignment, and then use the workflow variable as an argument in the Command task.
We can insert or update the rows without using the update strategy. Then what is the necessity of the update strategy?
- Aug 12th, 2018
1. You cannot perform DD_DELETE unless you have an Update Strategy.
2. While updating a record, ideally the created date should not be changed; for that we definitely have to create two targets (insert and update).
- Feb 15th, 2008
In the Update Strategy: 1) an insert in the source is an insert in the target; 2) an update in the source is an update in the target. The Update Strategy provides 4 types of data-driven functions: 1) DD_INSERT 2) DD_UPDA...
Dynamic target creation using source data
Suppose in Informatica a source file has one column (CITY) and the data is Pune, Kol, Pune. I want to load the CITY data into dynamically generated target files (meaning it will create 2 targets, named Pune and Kol; the Pune target will hold the Pune values and the Kol target the Kol values). How?
- May 24th, 2018
Use this flow: Sorter (on CITY) -> Expression (compare the current and previous city name and, based on that, set a FLAG in a variable port) -> Transaction Control (IIF FLAG = 1, TC_COMMIT_B...
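Outside the tool, the split-by-city step the flow above implements (sample file name and values made up) can be sketched in one awk line:

```shell
# Sketch: write each row to a file named after its CITY value,
# mirroring the dynamically generated targets described above.
printf 'Pune\nKol\nPune\n' > city.txt
awk '{ print > ($1 ".txt") }' city.txt
# Pune.txt now holds the two Pune rows, Kol.txt the Kol row
```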
What are Dimensions and various types of Dimensions?
A set of level properties that describes a specific aspect of a business, used for analyzing the factual measures of one or more cubes that use that dimension. E.g. geography, time, customer, and product.
Editorial / Best Answer
Answered by: ManishTewatia
- Member Since Jul-2010 | Jul 22nd, 2010
What Md. Rehman is trying to say is that this categorizes SCDs, the Slowly Changing Dimensions, which are used to maintain historical data.
Dimension: A dimension is an organized hierarchy of categories, known as levels, that describes data in data warehouse fact tables.
The various types of dimensions are :
1) Shared and Private Dimensions: Describes the basic differences between shared and private dimensions and their uses
2) Regular Dimensions: Provides information about regular dimensions and their variations
3) Parent-Child Dimensions: Describes the creation of parent-child dimensions and identifies their advantages and restrictions
4) Data Mining Dimensions: Describes the creation of data mining dimensions and identifies advantages and restrictions to their use
5) Virtual Dimensions: Describes the creation of virtual dimensions and their advantages and restrictions
6) Dependent Dimensions: Describes the creation of dependent dimensions and identifies their advantages and restrictions
7) Write-Enabled Dimensions: Describes the creation of write-enabled dimensions and identifies their advantages and restrictions
- Mar 1st, 2018
Dimensions are of several types, but a few are commonly used: 1. Conformed dimension: a dimension shared by all fact tables, e.g. a time dimension. 2. Slowly changing dimension: of 3 types, SCD1, SCD2, SCD3. 3. Role ...
- Dec 23rd, 2010
Dimensions are of 3 types, mainly in the case of slowly changing dimensions: Type 1 -> does not maintain any history; update only (this is the normal practice in mappings). Type 2 -> maintains full histor...
Write a query to retrieve the latest records from the target table. If we have used an SCD2 version-type dimension, retrieve the record with the highest version number. For example:
verno  id   loc
1      100  bang
2      100  kol
1      101  bang
2      101  chen
We have to retrieve 100/kol and 101/chen. How is this possible through a query?
- Mar 2nd, 2018
What if we have
Your query will return 2, 3 and we will get
2, 101, Kol as output.
- Jul 9th, 2012
Sorry for my previous post. I missed a bit of question.
select * from target_table t where verno = (select max(verno) from target_table where id = t.id)
(The table name was missing in the original post; target_table is a placeholder. The max must be correlated by id alone, since loc changes across versions.)
What is the difference between Session task and Command task?
- Feb 15th, 2018
A session is one type of task; a task may be a session, a shell command, or an email task. When we create a task, there is a drop-down to choose which type of task should be created. A session is a type of task, similar to other ...
- Mar 12th, 2009
Session task: A session task is associated with a mapping; it helps in running the mapping. We need to specify the source and target connections etc. in the Mapping tab and configure the properties a...
How many types of sessions are there in Informatica? Please explain them.
- Dec 6th, 2017
Reusable and non-reusable
- Feb 18th, 2017
There are two types of sessions:
1. Reusable session: when you create a session in the Task Developer, it is reusable.
2. Non-reusable session: when you create a session within a workflow, it is non-reusable.
Transformation to Load 5 Flat files
What is the method of loading 5 flat files having the same structure into a single target, and which transformations will you use?
Editorial / Best Answer
Answered by: sarun5
- Member Since Feb-2008 | Mar 13th, 2008
I have got the answer to the question I asked; here you go:
This can be handled by using the file list in Informatica. If we have 5 files in different locations on the server and we need to load them into a single target table, in the session properties we need to change the file type to Indirect.
(Direct if the source file contains the source data. Choose Indirect if the source file contains a list of files.
When you select Indirect, the PowerCenter Server finds the file list then reads each listed file when it executes the session.)
I create a text file listing the paths and file names of the source files and save it as emp_source.txt in the directory /ftp_data/webrep/. In the session properties I give /ftp_data/webrep/ as the directory path, emp_source.txt as the file name, and Indirect as the file type.
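The indirect file itself is plain text, one source file path per line; a sketch with made-up file names (a local srcdir stands in for the server directory):

```shell
# Sketch: create 5 sample source files plus the list file the session
# reads when the source filetype is set to Indirect.
mkdir -p srcdir
for i in 1 2 3 4 5; do
    printf 'empno,ename\n' > "srcdir/emp$i.txt"
done
ls srcdir/emp*.txt > emp_source.txt   # point the session at this file
cat emp_source.txt                    # one source file path per line
```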
- Nov 27th, 2017
You should use an indirect file (which contains the path and name of all 5 files) as the source. The transformations depend on the business logic.
- Dec 16th, 2016
When we have identically structured flat files, why not simply use a Union to combine them into a single stream and then load to the target, instead of such complication? Or does Union not work on flat files?
PreSql in Source Qualifier and PreSQL in Target in Informatica
I am not able to understand the difference between pre-SQL in the Source Qualifier and pre-SQL on the target in Informatica. I have read many other posts; they say
pre-SQL means "a SQL statement executed using the source connection, before a pipeline is run" and post-SQL means "a SQL statement executed using the source connection, after a pipeline is run".
So if all pre-SQL runs at the same point...
- Nov 27th, 2017
Pre-SQL in the Source Qualifier is used to run SQL in the source database; similarly, pre-SQL on the target runs in the target database.
This is useful when the source and target databases are different.
Create Session log as a Parameter in Informatica Cloud
How do you create a session log as a parameter in Informatica Cloud?
- Oct 23rd, 2017
At the session level, under the General options, you will find the session log file and session log folder settings. We can parameterize those (e.g. $session_log_file).