Does sparkmagic support dual scala/python spark session? #890
Comments
🤖 Hey there, @shay1bz! Great to see you back in the sparkmagic universe. How's everything in your coding cosmos? 🚀

Yes, it is possible to extend sparkmagic to support dual Scala/Python Spark sessions, given that Livy supports Scala, Python, and SQL statements in the same session. The current implementation of sparkmagic already includes mechanisms to handle different programming languages for Spark sessions, as indicated by the `--language` argument of the `%spark add` magic. To support dual Scala/Python Spark sessions more explicitly, you could build on that session-management logic. Here is the critical part of the code that handles session creation with a specified language:

```python
@magic_arguments()
@argument(
    "-l",
    "--language",
    type=str,
    default=None,
    help="Language for Livy session; one of {}".format(
        ", ".join([LANG_PYTHON, LANG_SCALA, LANG_R])
    ),
)
```

And the session creation logic in the `add` magic:

```python
language = args.language
properties = conf.get_session_properties(language)
self.spark_controller.add_session(name, endpoint, skip, properties)
```

This setup allows for the creation of separate sessions for Scala and Python, leveraging Livy's capability to handle different languages. To work with both Scala and Python code in the same notebook, users can create two sessions (one for Scala and one for Python) and switch between them as needed using the `-s` argument.

Regarding session switching, sparkmagic handles this by letting users specify which session to use when executing a command with the `%%spark` magic. When adding a session, users can specify the language of the session using the `-l` argument:

```
%spark add -s my_session -l python -u https://sparkcluster.net/livy -t Kerberos -a my_user -p my_password
```

When executing Spark code or SparkSQL code, users can switch between sessions by specifying the session name with the `-s` argument:

```
%%spark -s my_session
```

As for the limit to the number of sessions that can be created and managed by sparkmagic, there is no explicit limit set within the client itself. The enforcement of session uniqueness is handled by checking whether a session name already exists in the session manager's dictionary:

```python
def add_session(self, name, session):
    if name in self._sessions:
        raise SessionManagementException(
            "Session with name '{}' already exists. Please delete the session"
            " first if you intend to replace it.".format(name)
        )
    self._sessions[name] = session
```

Therefore, the number of sessions that can be created and managed by sparkmagic is bounded by the resources and configuration of the Livy server rather than by the client. I hope this information helps! If you have any other questions or need further clarification, feel free to ask.
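To make the uniqueness check above concrete, here is a minimal self-contained sketch. `SessionManager` and `SessionManagementException` are simplified stand-ins for sparkmagic's real classes, not the actual implementation:

```python
# Minimal sketch of the session-name uniqueness check; these classes are
# simplified stand-ins for sparkmagic's internals, not its real API.

class SessionManagementException(Exception):
    pass


class SessionManager:
    def __init__(self):
        self._sessions = {}

    def add_session(self, name, session):
        # Reject duplicate names so each notebook-level session name maps
        # to exactly one Livy session on the server.
        if name in self._sessions:
            raise SessionManagementException(
                "Session with name '{}' already exists. Please delete the session"
                " first if you intend to replace it.".format(name)
            )
        self._sessions[name] = session


manager = SessionManager()
manager.add_session("scala_session", {"kind": "spark"})
manager.add_session("python_session", {"kind": "pyspark"})
try:
    manager.add_session("scala_session", {"kind": "spark"})
except SessionManagementException as e:
    print("rejected:", e)
```

The dictionary illustrates why there is no hard client-side session limit: the only constraint enforced locally is name uniqueness.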
@dosu-bot please elaborate on single session, multiple languages. Could users choose a default language, for example Scala, and also execute PySpark cells in the same session?
🤖 To execute PySpark cells with sparkmagic in an IPython notebook, first load the magics extension:

```
%load_ext sparkmagic.magics
```

For example, if you have created a PySpark session named `my_pyspark_session`, you can run code in it like this:

```
%%spark -s my_pyspark_session
numbers = sc.parallelize([1, 2, 3, 4])
print('First element of numbers is {} and its description is:\n{}'.format(numbers.first(), numbers.toDebugString()))
```

This approach allows you to execute PySpark code in the same IPython notebook, even if Scala is chosen as the default language for sparkmagic sessions. You can refer to examples/Magics in IPython Kernel.ipynb for more examples and details.

Regarding how sparkmagic handles multiple sessions, including sessions with different languages like Scala and Python, it allows the user to configure the session parameters and language through magics. The language for a session is set when the session is added, via the `-l` option of the `add` magic. The actual management of sessions, including their language settings, is handled by the Livy server; sparkmagic serves as the interface for communication and for executing commands within those sessions.

In summary, sparkmagic handles multiple sessions, including those with different languages, by letting users configure session parameters through magic commands, while the Livy server manages the sessions themselves. You can refer to sparkmagic/sparkmagic/kernels/kernelmagics.py for more details.
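At the REST level, Livy creates a session with `POST /sessions` and runs code with `POST /sessions/{id}/statements`; since Livy 0.5, a session of kind `shared` allows each statement to carry its own `kind` (`spark`, `pyspark`, or `sql`), which is what makes a single mixed-language session possible. The sketch below only builds the JSON payloads and sends nothing; the helper names are illustrative, not part of sparkmagic or Livy:

```python
# Illustrative helpers that construct Livy REST payloads for a shared
# (mixed-language) session. Helper names are hypothetical; no HTTP
# request is sent here.

def build_session_payload(kind="shared"):
    # Since Livy 0.5, kind="shared" lets each statement pick its language.
    return {"kind": kind}


def build_statement_payload(code, kind=None):
    # A per-statement kind ("spark", "pyspark", or "sql") overrides the
    # session default, so Scala and Python can share one session.
    payload = {"code": code}
    if kind is not None:
        payload["kind"] = kind
    return payload


session = build_session_payload()
scala_stmt = build_statement_payload("val n = sc.parallelize(1 to 4).count()", kind="spark")
python_stmt = build_statement_payload("print(sc.parallelize([1, 2, 3, 4]).count())", kind="pyspark")
```

Extending sparkmagic in this direction would mean targeting one shared Livy session and tagging each cell with a statement-level kind, instead of creating one single-language session per language.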
As Livy supports Scala/Python/SQL statements in the same session, is it possible to extend sparkmagic to support this as well?