Spark Syntax

Intro to Spark

The main() function

As in most other programming languages, the main() function is where you put your "main" code, as the name suggests. For example:

main() {
   #code goes here#
}

More generally, anything between a pair of curly braces ({ and }) is a block; main() is merely a special type of block.

Parts of a Block

At the top of a block, you'll want to declare your variables. Here is the syntax for that:

{
    vars:
       type1: varname1, varname2;
       type2: varname3, varname4, varname5;
    # rest of your code here #
}
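
For instance, here is a hedged sketch of a main block declaring a few variables. It uses the int and bool type names that appear in the function example further down; the variable names are made up for illustration:

main() {
    vars:
        int: count, total;
        bool: done;
    # rest of your code here #
}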

Next, we have function definitions.

Defining a function

A function definition consists of the following parts, in this order:

  • name
  • parameters
  • return type
  • body (which is just a block)
  • return value

For example:

def foo (int: a; bool: b;) -> int {
    def sub (int: a) {
        program:
            print('{a}');
    }
    program:
        print('Delegating print to sub-function: ');
        sub(a);
        print('Sub completed');
} -> return 1

This is a slightly more complex function, so let's break it down.
First, we have the def keyword, which is used before every function declaration. Next, we declare a function foo, which accepts an integer a and a boolean b. We then use an arrow, followed by a type (in this case, int), to signify the return type. Next, we move on to the body block of foo.
Something interesting happens here; we declare a function named sub within our function foo! What's more interesting is that sub doesn't appear to have the arrow signifying return type. What does this mean? This signifies what is referred to in many other languages as a void function. It simply prints an integer a passed to it, and doesn't return a value. (You may have also noticed the program keyword; we'll talk about that next.) Moving back to foo, we see it prints a string, then calls sub, which we just went over. It then prints again, before returning a value of 1, which is signified by -> return <value>.
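
To make the calling side concrete, here is a small sketch of another function that delegates to foo, in the same way foo delegates to sub. The name wrapper is made up for illustration, and it assumes foo is visible from wherever wrapper is defined, which this page doesn't spell out:

def wrapper (int: a; bool: b;) -> int {
    program:
        print('Calling foo: ');
        # assumes foo from the example above is in scope here #
        foo(a, b);
        print('foo completed');
} -> return 1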

The program Keyword

You may have already noticed the program: keyword when we were talking about functions. You may also have guessed that it marks the main code of the block; everything between program: and the closing brace is what will be executed.
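
For instance, here is a hedged sketch of a complete main block showing where program: sits relative to the vars: section (the message text and variable name are made up):

main() {
    vars:
        int: a;
    program:
        # only the statements after program: are executed #
        print('Inside the program section');
}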

The print function

The Spark implementation of print is quite similar to Python's. Here's an example print statement that showcases how to use it:

print('Value of a: {a}; we can also add and multiply strings like so: {'adding' + ' strings' * 2}')