ML infrastructure with PySpark for a Java backend

  • Context: Java 
  • Thread starter: Avatrin
  • Tags: Java
SUMMARY

This discussion focuses on creating a machine learning (ML) infrastructure using PySpark for a Java or C# backend. The user seeks tutorials on building a complete ML cloud infrastructure that allows Python-trained models to be accessed via Java or C#. Key recommendations include learning to create ML pipelines with PySpark and utilizing Java to call Python scripts for model predictions. Resources such as tutorialspoint.com and digital-thinking.de are suggested for further guidance.

PREREQUISITES
  • Basic knowledge of Scala programming
  • Understanding of fundamental database concepts
  • Experience with a Linux distribution
  • Familiarity with machine learning pipelines in PySpark
NEXT STEPS
  • Learn to create ML pipelines with PySpark
  • Research how to call Python scripts from Java
  • Explore batch processing techniques in Java for ML predictions
  • Review the tutorial on integrating deep learning models within the Java ecosystem at digital-thinking.de
USEFUL FOR

Data scientists, backend developers, and machine learning engineers looking to integrate Python-based ML models into Java or C# applications.

Avatrin
Let's say I have experience creating ML models in Python, and have decided to train my models on Spark using PySpark. This will form part of an ML infrastructure for a website with a Java or C# backend. How can I make this work? I am a beginner when it comes to Spark.

I am looking for any tutorial(s) that show me how to create a complete ML cloud infrastructure which can be trained with Python but accessed through Java or C#.
 
First, if you don't know much about Apache Spark, you can read through this tutorial from tutorialspoint.com. As prerequisites for that reading, you should have some prior exposure to Scala programming, a grasp of (at least) basic database concepts, and some experience with a Linux distribution.

Then - if you haven't already done so - you should learn how to build ML pipelines with PySpark. There are tutorials for this too, such as this one from tutorialspoint.com; there are of course other good tutorials as well, which you can find by googling.

Now, for the Java backend you ask about: assuming you plan to write everything related to your ML model(s) in Python and then call a Python script from Java, you can write Java code to do some processing - for example, some form of batch-processing task to prepare data for predictions from a deep learning model - export the preprocessed data to .csv or .json format, and then call your Python script (from bash, for instance), passing the parameters. To get an idea of the process, you can take a look at this example of using deep learning models in the Java ecosystem at digital-thinking.de: http://digital-thinking.de/how-to-using-deep-learning-models-within-the-java-ecosystem/
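The Python side of that hand-off can be sketched as a small command-line script. Everything here is hypothetical - the file layout, the column names, and especially the `predict` function, which is a stand-in for loading and calling your actual trained model:

```python
# Sketch of the Python end of the Java -> Python hand-off. The Java backend
# exports preprocessed rows to CSV, then invokes this script from the shell:
#
#   python predict.py input.csv output.json
#
# The column names ("id", "feature") and the scoring logic are hypothetical;
# in practice predict() would call a real trained model loaded from disk.
import csv
import json
import sys

def predict(row):
    # Placeholder threshold "model" - replace with your model's predict call.
    return 1.0 if float(row["feature"]) > 0.5 else 0.0

def run(input_csv, output_json):
    # Read the rows Java exported, score each one, and write JSON back
    # for the Java side to pick up.
    with open(input_csv, newline="") as f:
        rows = list(csv.DictReader(f))
    results = [{"id": r["id"], "prediction": predict(r)} for r in rows]
    with open(output_json, "w") as f:
        json.dump(results, f)
    return results

if __name__ == "__main__" and len(sys.argv) == 3:
    run(sys.argv[1], sys.argv[2])
```

On the Java side, the corresponding call would typically go through `ProcessBuilder` (or an equivalent process-spawning API), passing the two file paths as arguments and waiting for the script to exit before reading the JSON output.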

Needless to say, in order to accomplish your specific goal(s) you'll need to mix and match things accordingly. I don't think you'll find a single start-to-finish tutorial covering the whole thing for the specific setup you have in mind.
 
