I built this little example of how to use www.robbie.run to run batch LLM inference on customer service conversations and identify the topic and sentiment of each one.
Here's what I used:
Thanks to Chris Crosby for the prompts.
Example customer conversations in the .csv:
Inference running on the remote machine!
JSON output with topics and sentiment.
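If you want a feel for what the batch step looks like, here's a minimal sketch (not the repo's actual code): it reads conversations from a CSV, asks a Llama 3 model behind an OpenAI-compatible endpoint for the topic and sentiment of each one, and writes the results to JSON. The endpoint URL, model name, file names, and CSV column names are all assumptions for illustration; in the real example the prompts and the robbie.run job do the heavy lifting.

```python
# Minimal sketch, assuming an OpenAI-compatible chat endpoint
# (e.g. a Llama 3 server on the remote machine). Column names,
# file names, endpoint URL, and model name are illustrative only.
import csv
import json

from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed Llama 3 server address
    api_key="not-needed",                 # placeholder for a local server
)

SYSTEM_PROMPT = (
    "You label customer service conversations. Reply with JSON only: "
    '{"topic": "...", "sentiment": "positive|neutral|negative"}'
)

def classify(conversation: str) -> dict:
    """Ask the model for the topic and sentiment of one conversation."""
    resp = client.chat.completions.create(
        model="llama3",  # assumed model name on the server
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": conversation},
        ],
        temperature=0,
    )
    # The prompt asks for strict JSON, so parse the reply directly.
    return json.loads(resp.choices[0].message.content)

results = []
with open("conversations.csv", newline="") as f:  # assumed input file
    for row in csv.DictReader(f):
        # "id" and "conversation" are assumed CSV column names.
        results.append({"id": row.get("id"), **classify(row["conversation"])})

with open("results.json", "w") as f:
    json.dump(results, f, indent=2)
```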
Here's the repo link: https://github.com/Positron-Networks/robbie-examples/tree/main/command_runner/llama3-batch-inference