2 Replies Latest reply: Sep 9, 2012 2:42 PM by Michael Tarallo

    Operator Association Issue

      Hi Community,

       

       

      I am new to expressor, and I am trying to connect three operators to each other, as illustrated in the attached image files.

       

      The connection to the 3rd operator generates error messages as follows:

       

      SampleDataflow1:

         Read File 1:

            Error: The Attribute "field1" is required by another operator downstream of this location, but this requirement has not been satisfied.

            Error: The Attribute "field2" is required by another operator downstream of this location, but this requirement has not been satisfied.

            Error: The Attribute "field3" is required by another operator downstream of this location, but this requirement has not been satisfied.

            Error: The Attribute "field4" is required by another operator downstream of this location, but this requirement has not been satisfied.

         Write File 2:

            Error: The Attribute "field1" is required by this operator, but this requirement has not been satisfied.

            Error: The Attribute "field2" is required by this operator, but this requirement has not been satisfied.

            Error: The Attribute "field3" is required by this operator, but this requirement has not been satisfied.

            Error: The Attribute "field4" is required by this operator, but this requirement has not been satisfied.

        • Re: Operator Association Issue
          Diamantis Archontoglou

          I am not an expert, but it seems that the schemas do not match, or at least that was the case when I had this kind of error message.

          You should try to match the composite types of input 1 to the outputs by editing the schemas.

           

          There is an excellent example by Mike Tarallo here:

          http://www.youtube.com/watch?v=C198eBIYMLQ&list=UUtLWV632L8lv2OhOrR9K4RQ&index=6&feature=plcp

          • Re: Operator Association Issue
            Michael Tarallo

            Hello MOHAMMED,

             

            (Diamantis - thanks for your contribution - the video you referenced is definitely a great way to learn about Schemas and Semantic Types)

             

            This is definitely an area that requires some training on the product, but it is also easily handled once you have a grasp of Schemas and Semantic Types and how they represent data in a flow.

             

            • If a schema used for writing to a target has attributes in it that are not referenced or mapped in the dataflow (or read schema), you will see the error you are mentioning.
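
            In other words, the Write File operator requires attributes named field1 through field4, but nothing upstream produces them. Here is a toy Python sketch of that validation check (the schema names are made up for illustration; this is not expressor code):

              # Illustrative only: a toy version of the "attribute required downstream" check.
              # read_schema / write_schema are hypothetical names, not expressor APIs.
              read_schema = {"colA", "colB"}                           # attributes the Read operator produces
              write_schema = {"field1", "field2", "field3", "field4"}  # attributes the Write operator requires

              for attr in sorted(write_schema - read_schema):
                  # Mirrors the dataflow validation message you are seeing:
                  print('Error: The Attribute "%s" is required by another operator '
                        'downstream of this location, but this requirement has not been satisfied.' % attr)

            Fixing it inside expressor means editing the schemas (or sharing a Semantic Type) so that every attribute the write schema requires is actually mapped from something upstream.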

             

            Here are some things to remember:

             

            • Schemas are used by READ and WRITE operators only

            • Schemas describe the PHYSICAL EXTERNAL FIELDS of source or target data and map to a collection of LOGICAL INTERNAL ATTRIBUTES called a Semantic Type

            • The LEFT side of a Schema contains the external PHYSICAL FIELDS

            • The MIDDLE of a Schema contains the mappings
              (some can be edited to translate data from one data type to another)

            • You use mappings to map data from the PHYSICAL FIELDS to the LOGICAL ATTRIBUTES in the Semantic Type

            • The RIGHT side of the Schema contains the INTERNAL LOGICAL ATTRIBUTES; you can call this the Metadata

            • Semantic Types are used and referenced inside of Schemas - as local or shared composite types

            • The COLLECTION of logical attributes is called a Composite Semantic Type

            • By default, when creating a schema from a physical file, table, etc., a local copy of the Semantic Type is created, which is called a Local Composite Type

            • Semantic Type data types are generic, also known as primitive
              - String, Integer, Decimal, Double, DateTime, Byte

            • Semantic Types can be defined as "Actionable" because they contain "Active Metadata":
              you can assign constraints, default values, corrective actions and error handling all within the Type (see the sketch after this list)

            • Semantic Types can be SHARED and ASSIGNED to other Schemas to be reused

            • NOTE: Shared Semantic Types RETAIN the rules (constraints, default values, corrective actions and error handling) defined in them - so be careful when using them in Transform and other operators, as the data flowing through them might trigger a rule and cause an error during the flow.

            • A Transform Operator that is used to transform existing data or create NEW attributes can also SHARE those Output Attributes and then have them ASSIGNED to a Schema that will be mapped to the target

            • Semantic Types' attribute NAMES AND DATA TYPES MUST ultimately match the target schema they are writing to

            • Reject Ports on READ Operators that connect to WRITE operators will have a different Schema layout than the original read schema and need to be created from the Upstream output. Here are the attributes produced from the reject port:

              RejectType - the type of reject
              RecordNumber - the number of the data record that was rejected
              RecordData - a comma-delimited string of the actual data values rejected, encapsulated in quotes
              RejectReason - the reason the data was rejected
              RejectMessage - system-generated additional messages

            • Reject Ports on WRITE Operators can use the same Schema layout that is writing to the desired target.
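
            To tie the vocabulary above together, here is a small Python analogue of a Schema mapping and an "actionable" Composite Semantic Type. None of these class or field names come from expressor; they are invented purely to mirror the concepts (physical fields on the left, mappings in the middle, logical attributes with rules on the right):

              # Illustrative analogue only - not expressor code or its datascript API.
              from dataclasses import dataclass
              from typing import Callable, Optional

              @dataclass
              class Attribute:                              # one logical internal attribute
                  name: str
                  datatype: type
                  default: object = None                    # default-value rule
                  constraint: Optional[Callable] = None     # constraint rule
                  corrective: Optional[Callable] = None     # corrective action if the constraint fails

              # Composite Semantic Type: the collection of logical attributes (the "right side")
              composite_type = {
                  "field1": Attribute("field1", str, default=""),
                  "field2": Attribute("field2", int, constraint=lambda v: v >= 0, corrective=abs),
              }

              # Schema mapping: external PHYSICAL FIELD -> logical ATTRIBUTE (the "middle")
              mapping = {"CUST_NAME": "field1", "CUST_BALANCE": "field2"}

              def apply_schema(record):
                  """Map one physical record onto the logical attributes, applying the type's rules."""
                  out = {}
                  for physical, logical in mapping.items():
                      attr = composite_type[logical]
                      value = attr.datatype(record.get(physical, attr.default))
                      if attr.constraint and not attr.constraint(value):
                          value = attr.corrective(value) if attr.corrective else None  # rule fires here
                      out[logical] = value
                  return out

              print(apply_schema({"CUST_NAME": "Mohammed", "CUST_BALANCE": "-42"}))
              # -> {'field1': 'Mohammed', 'field2': 42}

            The point of the sketch is simply that the rules live with the Type itself, which is why a shared Semantic Type can trigger them again wherever it is reused.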

            Let us know if you have any questions.

             

            Mike T