Input variable does not show in main integration - For Each Parallel Task

I am following this documentation - https://cloud.google.com/application-integration/docs/insert-data-bigquery-for-each-parallel-task to build integration for adding rows to BigQuery. 

I encountered a problem when I reached step 6 under "Configure the For Each Parallel task".

I am not able to see the variable from the sub-integration available for mapping under the "Where to map individual array elements" configuration.

I would really appreciate your input to resolve this problem. I have attached a screenshot for reference.
Screenshot 2023-09-29 at 11.42.23 PM.png
Solved
1 ACCEPTED SOLUTION

Hi piyushbhandari,

Thanks for the question. After looking into it, it appears we need to update our documentation, as there are a couple of missing or inaccurate items in it. We'll do that promptly, but in the meantime, to help unblock you:

  • - In the "Create a sub-integration" -> "Configure the Data Mapping Task" section step 4, the variable should be marked as an input to the sub-integration by checking the "Use as an input to the integration". You can edit the variable by clicking the "More actions" ellipsis next to the variable name in the variable panel (left side of screen) and going to "View details". The checkbox is towards the bottom of the variable editor:brayton_0-1696253803693.png

     

  • In the "Set up the main integration" -> "Configure the For Each Parallel task" section step 5, you should select the "Run a single integration" instead of "Run all integrations with this API Trigger ID" and specify the sub-integration you created. After that, the input variables should be visible in the "Where to map individual array elements" dropdown referenced in step 6.

Apologies for the inconvenience and please let us know if you have any further questions.


7 REPLIES


Thanks for your response @brayton - That works!

Another issue came up while following step 2 under "Test your integration" in the same tutorial.

When I run the command with the name of the API trigger after publishing the integration, I get a syntax error. I checked the count of braces but it looks fine to me. I have attached a screenshot of the error.

Screenshot 2023-10-02 at 3.33.33 PM.png

 

Glad it worked! In your screenshot, it looks like the `triggerId` is missing its leading quotation mark (`"`) - could you try adding it so the body is valid JSON?

Screenshot 2023-10-02 at 5.58.39 PM.png
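As a quick way to catch quoting slips like this before sending the request, you can pipe the body through `jq`. A small sketch - the `bad_body` value below just reproduces the missing leading quote from the screenshot:

```shell
# The leading quote on triggerId is missing, as in the screenshot,
# so jq reports a parse error instead of the API returning an opaque failure.
bad_body='{ triggerId": "api_trigger/process-records_API_1" }'
echo "$bad_body" | jq empty || echo "invalid JSON - check your quotes"
```

`jq empty` parses the input and produces no output, so its exit status makes a cheap JSON validity check.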
Thanks for pointing that out @brayton - I tried again but got the same error.

@piyushbhandari I think there may be a formatting issue with the bash script in our doc. If you copy it as below, does it work? It works in my testing.

AUTH=$(gcloud auth print-access-token)
export SAMPLE_DOCS=$(jq $(r=$((RANDOM % 1000)) ; echo ".[$r:$((r + 3))]") < bq-sample-dataset.json | jq -Rs '.')

generate_post_data() {
  cat <<EOF
{
"triggerId": "api_trigger/process-records_API_1",
"inputParameters": 
  {
    "records": 
      {
        "jsonValue": $SAMPLE_DOCS
      }
  }
}
EOF
}
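One way to sanity-check the generated body before wiring it into curl is to run it through `jq`. A self-contained sketch - `SAMPLE_DOCS` is stubbed here with a dummy JSON string standing in for the rows pulled from bq-sample-dataset.json:

```shell
# Dummy stand-in for the jq extraction from bq-sample-dataset.json:
SAMPLE_DOCS='"[{\"name\": \"test\"}]"'
body=$(cat <<EOF
{
"triggerId": "api_trigger/process-records_API_1",
"inputParameters": { "records": { "jsonValue": $SAMPLE_DOCS } }
}
EOF
)
# jq exits non-zero on a malformed body, catching brace/quote slips early:
echo "$body" | jq empty && echo "payload OK"
```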

 

Hey @brayton

Thank you for providing the script in the new form! I really appreciate it.

When I run the above snippet via script.sh in Cloud Shell it appears to work, as I don't see any error, but when I run the curl command in the post.sh script file I get an authentication error. I have attached screenshots from running both snippets. Please let me know if I am doing something wrong here.

post.sh file containing curl command from "Test your Integration" section of tutorial
script.sh file containing command snippet from "Test your Integration" section of tutorial

 

@piyushbhandari
I haven't had a chance to try this myself with your setup, but since you're receiving the 401, I think your `post.sh` doesn't have access to the `AUTH` variable defined in `script.sh`, since it hasn't been exported. Exporting it (or moving that line into `post.sh` before the curl) should correctly populate the header. I also see there was a `generate_post_data: command not found` error - I think this is a similar issue, and you may need to export the function from the script so it's accessible by the other script.
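To illustrate the export fix, here is a minimal, self-contained demo - `AUTH` is a dummy value standing in for the gcloud access token, and the `bash -c` child process stands in for running `post.sh` as a separate script:

```shell
#!/usr/bin/env bash
# Without `export`, shell variables and functions are local to the script
# that defines them and invisible to a separately-run post.sh.
export AUTH="dummy-token"   # stands in for $(gcloud auth print-access-token)
generate_post_data() { echo '{"triggerId": "api_trigger/process-records_API_1"}'; }
export -f generate_post_data   # bash-specific: export a function to child shells

# Child bash process simulating `bash post.sh`; both names are now visible:
bash -c 'echo "Authorization: Bearer $AUTH"; generate_post_data'
```

Note that `export -f` is a bash feature, so this works when both scripts run under bash (as in Cloud Shell), not plain sh.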