
Possibilities for importing Slack and similar information into an LLM

Proposed by Athonia [REMOTE]

LK102

Where will the conversation continue?
Slack #LLM-productivity channel
I have proposed a breakout meeting room for the #itunconf (Unconference) today at 3:05 pm!

Specifically, I would like to ask others whether it would be desirable to pull data sources such as our historic Slack channel conversations and ServiceNow tickets into an LLM, perhaps one like the new Google NotebookLM (https://notebookLM.google/), so that we can query that data in a conversational, chat-style way.
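To make that a bit more concrete, here is a rough sketch (not production code, and only my assumption about the approach) of what exporting a channel's history with Slack's documented conversations.history Web API method could look like. The token and channel ID are placeholders:

```python
# Rough sketch: export a Slack channel's history to a plain-text file that
# could then be uploaded to an LLM tool as a source document.
# Uses the documented conversations.history Web API method; the token and
# channel ID below are placeholders.
import os
import requests

SLACK_TOKEN = os.environ["SLACK_BOT_TOKEN"]   # bot token with channels:history scope
CHANNEL_ID = "C0123456789"                    # placeholder channel ID

def export_channel_history(channel_id: str, out_path: str) -> None:
    """Page through conversations.history and write one message per line."""
    cursor = None
    with open(out_path, "w", encoding="utf-8") as out:
        while True:
            params = {"channel": channel_id, "limit": 200}
            if cursor:
                params["cursor"] = cursor
            resp = requests.get(
                "https://slack.com/api/conversations.history",
                headers={"Authorization": f"Bearer {SLACK_TOKEN}"},
                params=params,
            ).json()
            if not resp.get("ok"):
                raise RuntimeError(f"Slack API error: {resp.get('error')}")
            for msg in resp.get("messages", []):
                out.write(f"{msg.get('ts')}\t{msg.get('user', 'unknown')}\t{msg.get('text', '')}\n")
            cursor = resp.get("response_metadata", {}).get("next_cursor")
            if not cursor:
                break

export_channel_history(CHANNEL_ID, "llm-productivity-history.txt")
```

Thread replies would need a second pass with conversations.replies, and the resulting file is the kind of source document NotebookLM (or any of the platforms listed below) could likely take as input.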

We would need to address privacy/security first and foremost.
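On that note, one small, concrete first step could be a redaction pass over any export before it leaves our environment. The sketch below is my own illustration (simple regexes only) and would not replace proper data-governance or DLP review:

```python
# Minimal sketch of a first-pass redaction step to run over an export before
# uploading it anywhere. Regexes only catch the obvious patterns; this is not
# a substitute for a proper data-governance / DLP review.
import re

REDACTIONS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"), "[EMAIL]"),
    (re.compile(r"\+?\d[\d\s().-]{7,}\d"), "[PHONE]"),
    (re.compile(r"<@U[A-Z0-9]+>"), "[SLACK_USER]"),  # Slack user mentions
]

def redact(text: str) -> str:
    """Replace obvious personal identifiers with placeholder tokens."""
    for pattern, replacement in REDACTIONS:
        text = pattern.sub(replacement, text)
    return text

print(redact("Ping <@U12345ABC> or email jane.doe@example.com, +1 (555) 010-1234"))
```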

There's a channel I'm part of that I think others would enjoy: #LLM-productivity on Slack.
Please join!
Notes

Copilot has helped a bit here by providing key points and discussion topics:

  1. Introduction to LLM Productivity:

    • Briefly explain what Large Language Models (LLMs) are and their potential for enhancing productivity.
    • Highlight the specific use case of querying historical Slack conversations and ServiceNow tickets.
  2. Privacy and Compliance Concerns:

    • Discuss the importance of data privacy and compliance when using LLMs.
    • Highlight key regulations such as GDPR, CCPA, and HIPAA, and how they impact the use of LLMs[1].
  3. Data Security Measures:

    • Emphasize the need for robust data governance policies[1].
  4. Choosing the Right LLM Platform:

    • Compare different LLM platforms based on features, security, and compliance capabilities.
  5. Integration with Existing Systems:

    • Explore how LLMs can be integrated with existing tools like Slack and ServiceNow.
    • Discuss potential challenges and solutions for seamless integration.
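To make item 5 a bit more concrete on the ServiceNow side: pulling tickets would most likely go through the ServiceNow REST Table API (the incident table in this sketch). The instance URL, credentials, and field list are placeholders I've made up for illustration:

```python
# Rough sketch: pull recent incidents from a ServiceNow instance via the REST
# Table API so they can be exported alongside the Slack history. Instance name,
# credentials, and the query string are placeholders.
import os
import requests

INSTANCE = "https://example.service-now.com"   # placeholder instance URL
AUTH = (os.environ["SN_USER"], os.environ["SN_PASSWORD"])

def fetch_incidents(limit: int = 100) -> list[dict]:
    """Return a list of incident records with a few fields we care about."""
    resp = requests.get(
        f"{INSTANCE}/api/now/table/incident",
        auth=AUTH,
        headers={"Accept": "application/json"},
        params={
            "sysparm_limit": limit,
            "sysparm_fields": "number,short_description,description,resolved_at",
            "sysparm_query": "ORDERBYDESCsys_created_on",
        },
    )
    resp.raise_for_status()
    return resp.json()["result"]

for incident in fetch_incidents(limit=10):
    print(incident["number"], incident["short_description"])
```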

Options for LLM Platforms

  1. Google NotebookLM:

    • Features: AI-powered research assistant, summarization, and insights from uploaded documents[2].
    • Pros: Easy to use, integrates well with Google Workspace.
    • Cons: Privacy concerns with data uploaded to Google’s cloud[2].
  2. AWS Bedrock:

    • Features: Fully managed service offering a choice of foundation models, customization with your data, and integration with AWS tools[3][4].
    • Pros: Strong security and compliance features, serverless experience, broad model choice[3][4].
    • Cons: Requires familiarity with the AWS ecosystem (see the call sketch after this list).
  3. Microsoft Azure OpenAI Service:

    • Features: Access to OpenAI’s powerful models, integration with Azure services, and enterprise-grade security.
    • Pros: Strong compliance and security, seamless integration with Microsoft products.
    • Cons: Potentially higher cost for extensive use.
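To ground the Bedrock option (item 2), here is roughly what a single model call looks like through boto3's Converse API. The region and model ID are assumptions on my part, and in practice we would probably want a retrieval layer (for example, Bedrock Knowledge Bases over documents in S3) rather than pasting exported text straight into the prompt:

```python
# Rough sketch of a single Amazon Bedrock call via boto3's Converse API.
# The region and model ID are assumptions; access to the model has to be
# enabled in the AWS account first.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def ask(question: str, context: str) -> str:
    """Ask a model a question grounded in a chunk of exported Slack/ServiceNow text."""
    response = bedrock.converse(
        modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # placeholder model ID
        messages=[{
            "role": "user",
            "content": [{"text": f"Context:\n{context}\n\nQuestion: {question}"}],
        }],
        inferenceConfig={"maxTokens": 512, "temperature": 0.2},
    )
    return response["output"]["message"]["content"][0]["text"]

print(ask("What was decided about the VPN outage?",
          open("llm-productivity-history.txt").read()[:8000]))
```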
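The Azure OpenAI option (item 3) has a very similar call shape through the official openai Python package; the endpoint, API version, and deployment name below are placeholders for whatever resource we would actually provision:

```python
# Rough sketch of the equivalent call against Azure OpenAI using the openai
# Python package. Endpoint, API version, and deployment name are placeholders
# specific to whatever resource we would set up.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://example-resource.openai.azure.com",  # placeholder
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",  # placeholder API version
)

response = client.chat.completions.create(
    model="gpt-4o-deployment",  # the *deployment* name, not the base model name
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": "Context:\n<exported Slack/ServiceNow text>\n\nQuestion: ..."},
    ],
    max_tokens=512,
    temperature=0.2,
)
print(response.choices[0].message.content)
```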

Discussion Topics

  1. Use Case Scenarios:

    • Discuss specific scenarios where querying historical data with LLMs can improve productivity.
  2. Data Privacy and Compliance:

    • Delve deeper into the regulatory requirements and how each platform addresses them.
  3. Security Best Practices:

    • Share best practices for securing data when using LLMs.
  4. Platform Comparison:

    • Compare the features, pros, and cons of each LLM platform.
  5. Integration Strategies:

    • Discuss strategies for integrating LLMs with existing systems and workflows.
  6. Future Trends:

    • Explore future trends in LLM technology and their potential impact on productivity.


References

[1] AI and Data Protection: Strategies for LLM Compliance | Normalyze

[2] NotebookLM - Easy With AI

[3] What is Amazon Bedrock? - Amazon Bedrock - docs.aws.amazon.com

[4] Build Generative AI Applications with Foundation Models - Amazon ...