Artificial intelligence – sovereign data

T Cloud Public launched the LLM serving services in the AI Foundation product family, a pool of ready-to-use large language models (LLMs) and embedding models.


In this article, you will learn:

  • which business challenges amberSearch solves with AI,
  • why cloud and AI are a perfect combination, and
  • how amberSearch met its customers’ sovereignty requirements.

With the LLM serving service, amberSearch has put together a package for companies with sensitive data that enables the sovereign use of artificial intelligence.

Everyday business: The tiring search for information

“Just where did I put that information I really need right now?” That’s a typical everyday situation for many office workers. Searching for enterprise information robs them of nearly half an hour of working time on average – every day. In such scenarios, artificial intelligence can truly contribute to higher employee productivity.

This is where amber Tech GmbH comes in, with its amberSearch service: “It’s our mission to break up the status quo of outdated B2B software solutions and give our customers the added value of ultramodern artificial intelligence (AI) and an intuitive user experience in our enterprise search products,” explains Philipp Reißel, co-founder and CEO of the AI specialist, which is headquartered in Aachen, Germany. amberSearch makes AI accessible even for companies that don’t have giant budgets or immense IT resources. A 30-person team at the company is now working on making internal information accessible quickly and easily. “We offer out-of-the-box solutions that are fast and easy to roll out, but still generate major added value for companies,” adds CRO (Chief Revenue Officer) Bastian Maiworm.

The business concept of using artificial intelligence for enterprise search – searching within a company’s internal resources – is winning people over: The name of the service has become synonymous with the company’s name. The AI provider has built a large customer base across various industries and company sizes. Its customers include large companies like Schüßler-Plan, Zentis, DB Regio, and Landmarken AG, as well as SMEs in the mechanical engineering, biotech, and financial fields. Pharmaceutical and financial companies face stringent compliance and security requirements; such companies are very careful to make sure that their internal data remains internal. The key phrase here is “data sovereignty”.

In search of a scalable and sovereign cloud

“Setting up our own infrastructure management and hosting was out of the question – we wanted to concentrate on our customer projects and the evolution of our AI solutions, and avoid unnecessary fixed costs,” says Maiworm. With this in mind, amber Tech quickly decided to go with the cloud. “T Cloud Public is a reliable platform that gives us the freedom to run our business. It scales with us and saves us from having to procure and run our own resources.” This also includes the demand for GPU resources to train and run its AI services. At the same time, however, the company is facing increased demands for data sovereignty: AI users, particularly financial firms, want to ensure that amberSearch has no way to access their data, even theoretically.


With T Cloud Public and the LLM serving service from the AI Foundation product family, we’ve found an elegant answer to the strict requirements our customers face – especially in regulated industries – about data sovereignty when using our AI services.

Bastian Maiworm, Co-founder, amber Tech GmbH

LLM services from T Cloud Public

“T Cloud Public gives us a simple, elegant solution to this as well,” explains Bastian Maiworm. In 2024, Deutsche Telekom launched the LLM serving services in the AI Foundation product family, a pool of ready-to-use large language models (LLMs) and embedding models that can be integrated seamlessly with customer AI applications through an API key. In this approach, leading open-source models from Meta or Mistral are hosted in T Cloud Public, while closed-source models from OpenAI, Google, or Anthropic are provided through third-party platforms. Each customer manages their API keys in a portal, where users are also administered and token usage can be displayed. “This is very convenient for users of T Cloud Public like ourselves. We can use these services as burst capacity on top of our own installation.” In other words: Normally, amberSearch serves customers from within its own installation. In response to specific inquiries or high loads, this installation is extended seamlessly through T Cloud Public – and not only at the level of simple infrastructure resources (IaaS). When can this become necessary?
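Before answering that question, it helps to picture the integration model itself. The following sketch shows what an API-key-based call to such a hosted LLM endpoint could look like, assuming an OpenAI-compatible chat completions interface; the base URL, model name, and environment variable names are placeholders rather than documented T Cloud Public values.

```python
# Minimal sketch of calling a hosted LLM through an API key. The base URL,
# model name, and environment variable names are placeholders, not documented
# T Cloud Public values; an OpenAI-compatible interface is assumed.
import os

import requests

API_BASE = os.environ.get("LLM_SERVING_BASE_URL", "https://llm-serving.example.com/v1")
API_KEY = os.environ["LLM_SERVING_API_KEY"]  # issued and managed in the customer portal


def chat(prompt: str, model: str = "llama-3-70b-instruct") -> str:
    """Send a single chat request to the (assumed) chat completions endpoint."""
    response = requests.post(
        f"{API_BASE}/chat/completions",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(chat("Summarize our travel expense policy in two sentences."))
```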

To understand, remember that amberSearch consists of two components. The original component is the model for searching and rating enterprise information, developed by amber Tech. The second part acts as the user interface: An LLM processes the queries and formulates an answer from the search results.
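Schematically, the interplay of the two components could look like the sketch below; the function names, signatures, and canned return values are illustrative assumptions, not amberSearch’s actual code.

```python
# Schematic of the two-component flow described above. The function and
# variable names are illustrative assumptions, not amberSearch's real API.
from typing import List


def search_enterprise_sources(query: str) -> List[str]:
    """Component 1 (hypothetical): search and rank internal enterprise content."""
    # In the real product this would query the connected systems
    # (file shares, wikis, ticket systems, ...) and return the
    # best-matching passages for the query.
    return [
        "Passage about the current travel expense policy ...",
        "Passage about per-diem rates for domestic trips ...",
    ]


def formulate_answer(query: str, passages: List[str]) -> str:
    """Component 2 (hypothetical): an LLM turns the search hits into an answer."""
    context = "\n\n".join(passages)
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {query}"
    )
    # Here the prompt would be sent to an LLM, for example via the chat()
    # helper from the previous sketch. A canned string stands in for that call.
    return f"[LLM answer to a prompt of {len(prompt)} characters, based on {len(passages)} passages]"


if __name__ == "__main__":
    question = "What is the per-diem rate for domestic trips?"
    hits = search_enterprise_sources(question)
    print(formulate_answer(question, hits))
```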

“The LLM serving service from Deutsche Telekom gives us access to the LLM pool, which is hosted directly in T Cloud Public. When our customers have increased demands for security, they get the corresponding LLM services directly from T Cloud Public – with which they have already concluded an agreement for commissioned data processing. As a result, even amberSearch has no theoretical way to access the data.” This means amberSearch users can satisfy even the highest requirements for data sovereignty.
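The routing idea behind this setup could be sketched roughly as follows; the tenant flag, endpoint URLs, and burst threshold are illustrative assumptions and are not taken from amberSearch or T Cloud Public documentation.

```python
# Sketch of the routing described above: tenants with elevated sovereignty
# requirements get their LLM calls served by endpoints hosted directly on
# T Cloud Public (under their own data processing agreement), while other
# traffic uses the provider's own installation and only bursts to the cloud
# under high load. All names, URLs, and thresholds are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Tenant:
    name: str
    requires_sovereign_llm: bool  # e.g. a regulated financial or pharma customer


OWN_INSTALLATION_URL = "https://llm.own-installation.example/v1"  # placeholder
TCLOUD_LLM_SERVING_URL = "https://llm-serving.tcloud.example/v1"  # placeholder


def llm_endpoint_for(tenant: Tenant, current_load: float) -> str:
    """Choose the LLM backend for a request; the burst threshold is an assumption."""
    if tenant.requires_sovereign_llm:
        # Requests go straight to the T Cloud Public endpoint, so the data
        # does not have to pass through the provider's own installation.
        return TCLOUD_LLM_SERVING_URL
    if current_load > 0.8:
        # High load: extend the own installation with cloud burst capacity.
        return TCLOUD_LLM_SERVING_URL
    return OWN_INSTALLATION_URL


print(llm_endpoint_for(Tenant("regulated-bank", True), current_load=0.3))
print(llm_endpoint_for(Tenant("mid-size-machine-builder", False), current_load=0.9))
```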

AI and cloud in perfect harmony

amberSearch is an example of a perfect interaction between cloud computing and AI. The company is building and developing its AI business model with the LLM serving service based on T Cloud Public, fully scalable and with adaptive costs. T Cloud Public also delivers another benefit: “The LLM serving service means we can also meet our customers’ demands for high levels of security and sovereignty for their sensitive data.” In addition, amber Tech can experiment with different LLMs to ensure that its AI services are always of high quality. Last but not least, amberSearch benefits – beyond the cloud and use-based LLMs – from Telekom’s customer and partner network.


