In this tutorial you will:

- Update the Project Object Model (POM) file to resolve Spark module dependencies.
- Generate a jar file that can be submitted to HDInsight Spark clusters.
- Run the application on the Spark cluster using Livy.

Prerequisites:

- An Apache Spark cluster on HDInsight. For instructions, see Create Apache Spark clusters in Azure HDInsight.
- Java. This tutorial uses Java version 8.0.202.
- A Java IDE. This article uses IntelliJ IDEA Community ver.
- The Azure Toolkit for IntelliJ. See Installing the Azure Toolkit for IntelliJ.

Do the following steps to install the Scala plugin:

1. On the welcome screen, navigate to Configure > Plugins to open the Plugins window.
2. Select Install for the Scala plugin that is featured in the new window.
3. After the plugin installs successfully, you must restart the IDE.

Use IntelliJ to develop a Scala Maven application:

1. Start IntelliJ IDEA, and select Create New Project to open the New Project window.
2. Select Apache Spark/HDInsight from the left pane.
3. Select Spark Project (Scala) from the main window.
4. From the Build tool drop-down list, select one of the available values.
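Resolving Spark module dependencies in the POM usually comes down to declaring the Spark artifacts your code compiles against. A minimal sketch, assuming a Spark 2.3.0 cluster built for Scala 2.11 (check your cluster's actual Spark and Scala versions before copying):

```xml
<!-- Versions here are assumptions; match the Spark/Scala build of your HDInsight cluster. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.3.0</version>
  <!-- "provided": the cluster supplies Spark at runtime, so keep it out of your jar. -->
  <scope>provided</scope>
</dependency>
```

Using the `provided` scope keeps the submitted jar small and avoids clashing with the Spark version already installed on the cluster.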
In these tutorials for Spark developers, you can explore how to install and set up a Scala application and how to run Apache Spark Scala commands from an IDE, including setting up a development environment on Windows. Since Apache Spark is written in Scala (with some parts in Java), one could assume that the best IDE would be Scala IDE (given the name). To me it is neither a good IDE for Scala nor for Spark development, and neither is NetBeans IDE (please note that I last worked with them a few years ago, so I might be wrong). Creating a Scala application in IntelliJ IDEA involves only a few steps, and it starts with an existing Maven archetype for Scala provided by IntelliJ IDEA.
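Whatever IDE you settle on, the application you end up writing is ordinary Scala. A minimal, self-contained sketch of the kind of logic such a project might hold (the object name and the word-count example are illustrative, not a template generated by the IDE):

```scala
// Illustrative example, not an IDE-generated template.
object WordCount {
  // Count word occurrences in a local collection; in a real Spark job the
  // same shape of logic would run over an RDD or Dataset instead.
  def countWords(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split("\\s+"))
      .filter(_.nonEmpty)
      .groupBy(identity)
      .map { case (word, occurrences) => word -> occurrences.size }

  def main(args: Array[String]): Unit =
    println(countWords(Seq("hello spark", "hello scala")))
}
```

Keeping the core logic in a plain function like `countWords` also makes it easy to unit test before wiring it up to a SparkContext.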
In this tutorial, you learn how to create an Apache Spark application written in Scala, using Apache Maven with IntelliJ IDEA. The article uses Apache Maven as the build system.
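The jar built from such a project is typically run on the cluster through Apache Livy's batch REST API (`POST /batches`). A sketch of how the request body might be assembled, where the storage path and class name are hypothetical placeholders:

```scala
// Sketch of the JSON body for Livy's POST /batches endpoint.
// The jar path and class name passed in are hypothetical placeholders.
object LivyPayload {
  def batchBody(jarPath: String, mainClass: String): String =
    s"""{"file": "$jarPath", "className": "$mainClass"}"""
}
```

In practice you would send this body, with a `Content-Type: application/json` header, to your cluster's Livy endpoint using an HTTP client of your choice.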