Step 1

Let’s start by making sure the miner node is running in local mode.

If you are following the guide step by step, you should not have added a bootnode to your .env file yet, which means your miner node is already running in local mode.

If you previously connected to the testnet by adding a bootnode to your .env file, comment that entry out to return to local mode. To confirm, check the .env file for any bootnode entries and comment out any you find, as in the example below.
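For reference, a commented-out bootnode entry in .env looks something like this. The variable name and multiaddress below are placeholders, not confirmed settings; use whatever key and value the main guide or your node release specifies:

# Commented out so the node stays in local mode.
# BOOTNODES is an assumed variable name and the multiaddress is a placeholder.
# BOOTNODES=/ip4/203.0.113.10/udp/4001/quic-v1/p2p/16Uiu2HAmExamplePeerId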

Step 2

Verify that your X/Twitter scraper is working and returning data.

Execute a curl request to the node running in local mode to ensure it retrieves X/Twitter data:

curl -X 'POST' \
  'http://localhost:8080/api/v1/data/twitter/tweets/recent' \
  -H 'accept: application/json' \
  -H 'Content-Type: application/json' \
  -d '{
    "query": "$Masa AI",
    "count": 1
  }'

You should receive a response similar to this:

{
  "data": [
    {
      "Error": null,
      "Tweet": {
        "ConversationID": "1828797710385942907",
        "GIFs": null,
        "HTML": "<a href=\"https://twitter.com/CryptoGodJohn\">@CryptoGodJohn</a> $MASA the leading token for <a href=\"https://twitter.com/hashtag/AI\">#AI</a> and <a href=\"https://twitter.com/hashtag/Data\">#Data</a> <br><a href=\"https://twitter.com/gesepolia Masafi\">@gesepolia Masafi</a>",
        "Hashtags": ["AI", "Data"],
        "ID": "1828900558452797478"
        // ... (other Tweet fields)
      }
    }
  ],
  "workerPeerId": "16Uiu2HAmSCQMh22Xmo1GMxXB73qRx3YaVqqL1UwTYn3iNvQLjPB5"
}

Verify that the workerPeerId in the response matches your node’s peerID.
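If you prefer to check this programmatically, you can extract just the workerPeerId field from the same request with jq (assuming jq is installed):

curl -s -X 'POST' \
  'http://localhost:8080/api/v1/data/twitter/tweets/recent' \
  -H 'accept: application/json' \
  -H 'Content-Type: application/json' \
  -d '{"query": "$Masa AI", "count": 1}' | jq -r '.workerPeerId'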


Security Considerations

  • Keep your X/Twitter credentials secure and do not share them.
  • Never commit your .env file with X/Twitter credentials to version control (see the .gitignore example after this list).
  • After successful setup and cookie storage, remove the TWITTER_2FA_CODE from your .env file.
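A simple way to enforce the second point, assuming your repository does not already do so, is to ignore the file in .gitignore:

# .gitignore
# Keep local credentials out of version control
.env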

Cloud-Based Scraping

If you are running an X/Twitter scraper in the cloud, we strongly recommend using a residential proxy. Without one, your scraper is likely to be blocked by X/Twitter, resulting in invalid-credentials errors. Make sure you have a reliable residential proxy service set up before deploying your scraper to a cloud environment.
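As a rough sketch, many proxy providers give you a single authenticated endpoint that you export before starting the node. The variable names (HTTP_PROXY/HTTPS_PROXY) and the endpoint below are assumptions, not confirmed node settings; check your provider's documentation and verify that your node build actually honors them:

# Hypothetical residential-proxy settings in .env.
# Variable names and endpoint are placeholders; confirm against your node's docs.
HTTP_PROXY=http://username:password@residential-proxy.example.com:8000
HTTPS_PROXY=http://username:password@residential-proxy.example.com:8000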

Resources

Twitter advanced API calls

Troubleshooting

If you encounter issues:

  • Ensure your X/Twitter credentials in the .env file are correct.
  • Check the node logs for any error messages related to X/Twitter scraping (a log-filtering sketch follows this list).
  • If running in the cloud, confirm your residential proxy is correctly configured and functioning.
  • If you’re experiencing frequent login requests or timeouts, try temporarily disabling 2FA, restarting your node to save cookies, and then re-enabling 2FA.
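For example, if you run the node in Docker under a container name such as masa-node (a hypothetical name; substitute your own), you can filter the logs for scraping-related errors:

# Container name and search terms are illustrative only
docker logs masa-node 2>&1 | grep -iE 'twitter|scrape|login|auth'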