Informatica – Loading a Flat File into a Target Database

Loading data from a flat file into a database is a frequent task for database/data warehouse developers.
This article details how to do it using Informatica PowerCenter.

Table Creation and Source File

The following is the script I used to create the table. As I already have an EMP table in my application, I used SCOTT_EMP as the name for this table. If you have worked with Oracle Database before, you might notice this is the default EMP table that comes in the demobld.sql file.
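The original script is not reproduced here; the following is a sketch based on the standard demobld.sql EMP definition, with the table renamed to SCOTT_EMP as described above.

```sql
-- Sketch of the table, assuming the classic demobld.sql EMP columns,
-- renamed to SCOTT_EMP to avoid clashing with an existing EMP table.
CREATE TABLE SCOTT_EMP (
    EMPNO    NUMBER(4) NOT NULL,  -- employee number
    ENAME    VARCHAR2(10),        -- employee name
    JOB      VARCHAR2(9),         -- job title
    MGR      NUMBER(4),           -- manager's EMPNO
    HIREDATE DATE,                -- hire date
    SAL      NUMBER(7,2),         -- salary
    COMM     NUMBER(7,2),         -- commission
    DEPTNO   NUMBER(2)            -- department number
);
```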

This is the CSV file that I’ll load into the table above. You can download it if you wish to follow along with these steps and load the data yourself.

emp_data

And here are the contents of the file…
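For reference — assuming the classic demobld.sql EMP data, with dates in the MM/DD/YYYY format configured later in this article — the first few lines would look something like this (an illustrative sample, not necessarily the exact file):

```
EMPNO,ENAME,JOB,MGR,HIREDATE,SAL,COMM,DEPTNO
7369,SMITH,CLERK,7902,12/17/1980,800,,20
7499,ALLEN,SALESMAN,7698,02/20/1981,1600,300,30
7521,WARD,SALESMAN,7698,02/22/1981,1250,500,30
```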

As you can see, the fields in the file are delimited by commas (hence, a CSV file). The first row of the file has the names of the fields (the header), and the actual data starts at row 2. If the file you are trying to load has a different format (say, a different delimiter), I’ll point out in a short while where you can specify that.

Source, Target and Mapping

The first thing you need to do is import the Source and Target definitions.

  1. Once logged into the Designer, go to Tools > Source Analyzer.
  2. Click Sources > Import from File.
  3. Select your file and the following options.
    1. Delimited (we’ll select the delimiter later).
    2. Name for this Source => FF_SCOTT_EMP (FF indicates it is a flat file source, followed by the name of the table).
    3. Start Import at Row => 1 (since the headings are at row 1 and the data starts at row 2).
    4. Check “Import Field Names from First Row”. Click Next.
    5. Select the appropriate delimiter (in our case, comma) and make sure the data is aligned correctly in the grid. Click Next.
    6. Click on each field and make sure the datatype and precision are selected correctly.
    7. E.g., EMPNO and DEPTNO should be numeric, HIREDATE should be Date, and so on. Click Next and confirm to import the Source definition.
    8. Go to the Properties tab and change the date format to “MM/DD/YYYY” instead of the default format.
    9. Save the changes to the folder (Repository > Save, or CTRL+S).
  4. Next, go to Tools > Target Designer.
  5. Click Targets > Import from Database.
  6. Enter your Database Connection Details, Select your table and Click Next.
  7. Your target will be imported. Save the changes.
  8. Now that the Source and target have been imported, the next step is to map them.
  9. Click Tools > Mapping Designer, Click on Mappings > Create to create a new mapping.
  10. Let’s name it m_SCOTT_EMP (naming standard: m_<<Target_Name>>).
  11. Drag and drop the Source from the “FlatFile” group in your folder. Drag and drop the target, SCOTT_EMP, into the Mapping Designer.
  12. Map the corresponding fields and save the changes.

Session and Workflow

Create a session (in the Workflow Manager) using this mapping, say s_m_SCOTT_EMP, and change the following parameters for the session.

  1. Click on the Connections tab and select the appropriate connection name for the session (if you don’t have one, go to the Connections menu and add the necessary connection).
  2. Under the Mapping tab, for the source, enter the directory name and the file name in “Source File Directory” and “Source File Name”.
  3. In the Config Object tab, set “Stop on errors” to 1, so the session fails at the first error instead of silently skipping bad rows.

Create a workflow with a Start task that links to this session, and run the workflow.

Verify the data in the target.
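One quick way to verify, assuming the SCOTT_EMP table created earlier, is to query the target directly:

```sql
-- Assuming the 14-row demo EMP data was loaded into SCOTT_EMP.
SELECT COUNT(*) AS loaded_rows FROM SCOTT_EMP;

-- Spot-check a few columns, including the converted HIREDATE values.
SELECT EMPNO, ENAME, TO_CHAR(HIREDATE, 'MM/DD/YYYY') AS HIREDATE
  FROM SCOTT_EMP
 ORDER BY EMPNO;
```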

Connect to the Workflow Monitor and check your workflow’s session log to make sure everything is as expected. More specifically, you should look for the following lines at the end of your session log.

Hope this helps..
Rajesh.

  • radha krishnan

    Hi Rajesh, your articles and posts related to Informatica help me a lot.. thank you so much..

  • sreehari chepuri

    When data moves from the flat file into the target database, my error is:
    input success rows: 14
    output success rows: 0
    total transformation error rows: 14
    The data is there in my flat file.
    I want to request you: please tell me the complete process for getting all the rows into the target database.

    • Rajesh

      “total transformation error rows: 14”
      This is where you should start. Check the session log file to see why the rows are erroring out (and which transformation is failing).
      If you already have the error, post it in a reply here.

  • Abhirup

    Hi, I am using an Expression transformation to load data from Excel (CSV) into Oracle, but it is not working. I guess it is because of a datatype mismatch. Could you please help resolve this issue?

    • Rajesh

      “..but it is not working” — Does the session fail? Or does the session succeed and you see no data? Or do you see a different number of rows than you expect?

      “I guess it is because of a datatype mismatch” — Please look at the session log file for the actual error (if the log file is huge, sort it by severity and look at the top results). Guessing at the problem and making random changes will cause more issues than you currently have.

  • DK

    Hi Rajesh,

    Thanks a lot for posting the articles; they really help a lot.

    From
    DK

  • vinesh

    Hi,
    How do you load a file into a table in Informatica? I want to know the generic steps from a testing perspective. I am using ALM QC; I need to write the testing steps and parameters as well.

  • Naresh Babu Kotha

    Good one

  • sruthi k

    Hi,
    I have a file whose data I loaded yesterday. Today, new records have come into the file. How do I load that new data into the same target (Oracle/file)?


  • Loganathan

    Hi,
    Logan here. I got an error with the above.
    In the source, HIREDATE has the Date/Time datatype with precision 29. But when I create the target from the DB, it shows the Date datatype with precision 19. Hence, the system shows an error.

    Can anyone help with this…..

    • Gautam Kumar

      Hi Logan,

      After importing the Source, go to the Properties tab and change the date format to “MM/DD/YYYY” instead of the default format. Also, provide the dates in the input flat file in the same format. Informatica will take care of the rest of the conversion.

      Gautam

      • Loganathan

        Gautham,

        Thanks for the comments. I need one more piece of advice.

        Using an Update Strategy transformation, I need to filter out the duplicate records and store them in a separate table.

        Please find my scenario below.

        Create a Lookup transformation on the columns EMPNO, DOJ of the target employee table (TGT_EMP_LKP).

        Then load the records from the source file. The new records from the file have to be looked up against the records already in the target table, based on the lookup condition; accordingly, either update the record if it already exists, or insert it if it does not.
        “Also create a new target table (ERR_TGT) to insert the duplicate/bad records from the file into.”

        I am done with the fourth line of the scenario. Please advise how to filter the duplicate records and store them in a separate table (ERR_TGT).

        Regards,
        Logan
