Data Flow - From Petals to a Job using a tPetalsInput

{section}
{column}

h1. Preliminary notes

This use case can only be reproduced with Talend Integration Suite.
People using Talend Open Studio can find an alternative in the use case "Data Flow - From Petals to a job using attachments".
{column}
{column:width=360px}
{panel}{toc}{panel}
{column}{section}

h1. Rationale

Send a data flow from Petals into a Talend job, which will insert this data into a database.
Within the scope of this use case, one could imagine that the data are transformed before being inserted, even though this is not shown here.


This job has no context variable.

\\
{info}

In the scope of this use case, it is assumed there is a database *formationtalend* on the localhost, having a table named *customers*.
The schema of the *customers* table includes two columns named *CustomerName* and *CustomerAddress*, both being of type varchar(255).

{info}
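As a reminder of these assumptions, the *customers* table could be created with a statement such as the following (a minimal sketch based only on the schema given above; adapt the DDL to your database engine):

{code:lang=sql}
-- Hypothetical DDL matching the assumptions above:
-- a database named formationtalend with a customers table
-- holding two varchar(255) columns.
CREATE TABLE customers (
    CustomerName    VARCHAR(255),
    CustomerAddress VARCHAR(255)
);
{code}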

h2. Creating the job

The job creation is detailed in the use case "A Simple Talend Job".
There is no difference.


h2. Exporting the job

Select the job and right-click it. Select *Export to Petals ESB*.
Update the target destination.
Let the job be exposed as a singleton.

Click *Edit the exposed contexts*.
A dialog shows up. Export the _outputLocation_ context as an *Out-Attachment*.

You should have the following dialog:




Click the *Export mode* column, and select *Parameter* in the combo box. Click *OK*.
The link label should be updated and indicate the number of exported contexts.



Click *Finish*.



h1. Deploying and testing in Petals


h2. Looking at the created archive and the generated WSDL

The created archive is a Petals service assembly.
More details are available in the documentation of the petals-SE-Talend.
In the created Petals service assembly, the most interesting thing to look at is the WSDL.
Indeed, the WSDL will determine the way the exported service will be called.

Pay attention to the jbi.xml and WSDL files available in the service-unit.
The created archive contains another archive (the service-unit), which in turn contains a jbi.xml file.
\\
Since the job has no context variable, the WSDL exposes no job parameter: the input message only carries the rows to load into the job's flow.
Its description could look like the following sketch (element and namespace names are illustrative assumptions; refer to the WSDL actually generated for your job):
{code:lang=xml}
<!-- Hypothetical sketch: names are assumptions, check your generated WSDL. -->
<xsd:element name="in-data-rows">
  <xsd:complexType>
    <xsd:sequence>
      <xsd:element name="in-data-row" minOccurs="0" maxOccurs="unbounded">
        <xsd:complexType>
          <xsd:sequence>
            <!-- One child element per column of the tPetalsInput schema -->
            <xsd:element name="CustomerName" type="xsd:string"/>
            <xsd:element name="CustomerAddress" type="xsd:string"/>
          </xsd:sequence>
        </xsd:complexType>
      </xsd:element>
    </xsd:sequence>
  </xsd:complexType>
</xsd:element>
{code}


\\
And the output message includes the job's result and the output attachment, along these lines (an illustrative sketch; the exact element names come from the generated WSDL):

{code:lang=xml}
<!-- Hypothetical sketch: element names are assumptions, check your generated WSDL. -->
<xsd:element name="out-result">
  <xsd:complexType>
    <xsd:sequence>
      <!-- The job's return code (0 on success) -->
      <xsd:element name="result" type="xsd:int"/>
      <!-- The output attachment, bound to the exported outputLocation context -->
      <xsd:element name="out-attachment" type="xsd:base64Binary"/>
    </xsd:sequence>
  </xsd:complexType>
</xsd:element>
{code}


h2. Deploying and testing this new service

To test this service, you can use a tool like SoapUI.
This way, you can see what the XML messages look like.

The first thing to do is to create a service-unit for the Petals-BC-SOAP component, that exposes (consumes) our _Talend job as a service_ outside the bus.
This step is not described here. You can take a look at the Petals-BC-SOAP documentation and the Petals Studio documentation.
Just make sure the SOAP configuration uses the InOut MEP.
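For reference, the MEP is declared in the service-unit's jbi.xml, on the consumes block. A heavily simplified, hypothetical sketch follows (namespaces, attribute names, and the placeholder values are assumptions; see the Petals-BC-SOAP documentation for the real descriptor):

{code:lang=xml}
<!-- Hypothetical, simplified sketch: names are assumptions. -->
<jbi:consumes interface-name="..." service-name="..." endpoint-name="...">
  <!-- The message exchange pattern must be InOut -->
  <petalsCDK:mep>InOut</petalsCDK:mep>
</jbi:consumes>
{code}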

\\
Now, your input message (in SoapUI) should look like the following sketch (namespaces and element names are illustrative assumptions; use the ones from your own WSDL):

{code:lang=xml}
<!-- Hypothetical sketch: namespaces and element names are assumptions. -->
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:job="http://example.org/talend-job">
  <soapenv:Body>
    <job:in-data-rows>
      <job:in-data-row>
        <job:CustomerName>John Doe</job:CustomerName>
        <job:CustomerAddress>5 High Street</job:CustomerAddress>
      </job:in-data-row>
      <job:in-data-row>
        <job:CustomerName>Jane Smith</job:CustomerName>
        <job:CustomerAddress>18 Long Road</job:CustomerAddress>
      </job:in-data-row>
    </job:in-data-rows>
  </soapenv:Body>
</soapenv:Envelope>
{code}

\\
Notice the XML shape.
The in-data-rows will be passed in raw mode to the job, and loaded into the job's flow by the tPetalsInput component.
Every in-data-row has the same list of children, each child being a column in the schema of the tPetalsInput.

Thus, the expected data schema is defined by the job, and not by the service's contract.
In fact, the service's contract is partially generated from this schema.
As said in the other use cases, it is the job's content which defines what the service contract will be.


\\
When everything works, the returned message looks like this (an illustrative sketch; the exact element names come from your generated WSDL):

{code:lang=xml}
<!-- Hypothetical sketch: element names are assumptions. -->
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:job="http://example.org/talend-job">
  <soapenv:Body>
    <job:out-result>
      <!-- 0 means the job completed successfully -->
      <job:result>0</job:result>
    </job:out-result>
  </soapenv:Body>
</soapenv:Envelope>
{code}
\\
If the job execution fails, the 0 is replaced by another integer, e.g. 1.
One possible cause of failure, for example, is that the database is not running.

To determine the exact cause of a problem, you would have to use the logging features available in the Talend palette.
However, be aware that the job's logs are managed independently of Petals and its monitoring capabilities.