package flink

Type Members

  1. sealed trait FlinkJobExecutor extends Serializable

    Strategy for executing Flink jobs, with separate implementations for local mode and cluster mode.

  2. abstract class FlinkStreamlet extends Streamlet[FlinkStreamletContext] with Serializable

    The base class for defining Flink streamlets. Derived classes need to override createLogic to provide the custom implementation for the behavior of the streamlet.

    Here's an example:

    // new custom `FlinkStreamlet`
    class MyFlinkProcessor extends FlinkStreamlet {
      // Step 1: Define inlets and outlets. Note that for the outlet you can specify
      //         the partitioner function explicitly, otherwise `RoundRobinPartitioner`
      //         will be used
      val in = AvroInlet[Data]("in")
      val out = AvroOutlet[Simple]("out", _.name)

      // Step 2: Define the shape of the streamlet. In this example the streamlet
      //         has 1 inlet and 1 outlet
      val shape = StreamletShape(in, out)

      // Step 3: Provide a custom implementation of `FlinkStreamletLogic` that defines
      //         the behavior of the streamlet
      override def createLogic() = new FlinkStreamletLogic {
        override def buildExecutionGraph = {
          val ins: DataStream[Data] = readStream(in)
          val simples: DataStream[Simple] = ins.map(r => Simple(r.name))
          writeStream(out, simples)
        }
      }
    }
  3. abstract case class FlinkStreamletContext(streamletDefinition: StreamletDefinition, env: StreamExecutionEnvironment) extends StreamletContext with Product with Serializable

    Runtime context for FlinkStreamlets.

    The FlinkStreamletContext provides the necessary context under which a streamlet runs. It contains the following context data and contracts:

    • An active StreamExecutionEnvironment that will be used to submit streaming jobs to the Flink runtime.
    • The Typesafe Config loaded from the classpath through a config method, which can be used to read configuration settings.
    • The name used in the blueprint for the specific instance of this streamlet being run.
    • A mapping from a port name to the name of the Kafka topic it is connected to.
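
    For illustration, here's a minimal sketch (not taken from the library's documentation) of how this context data might be used from inside a FlinkStreamletLogic, assuming the logic exposes its FlinkStreamletContext as context. The env and config members follow from the constructor and the contract above; the configuration key is a made-up assumption.

      // Hypothetical sketch: using the runtime context from a `FlinkStreamletLogic`
      override def createLogic() = new FlinkStreamletLogic {
        override def buildExecutionGraph = {
          // `config` is the Typesafe Config mentioned above; the key is an assumption
          val parallelism = context.config.getInt("my-flink-processor.parallelism")
          // `env` is the active StreamExecutionEnvironment passed to the context
          context.env.setParallelism(parallelism)
          writeStream(out, readStream(in).map(r => Simple(r.name)))
        }
      }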
  4. class FlinkStreamletContextImpl extends FlinkStreamletContext

    An implementation of FlinkStreamletContext

  5. abstract class FlinkStreamletLogic extends StreamletLogic[FlinkStreamletContext]

    The FlinkStreamletLogic provides methods to read and write data streams, and provides access to the configuration of the streamlet, among other things. A FlinkStreamletLogic must provide a method for executing streaming queries that process the Flink computation graph. The method buildExecutionGraph has to be overridden by implementation classes that process one or more DataStreams. The resulting graph is then submitted by executeStreamingQueries to the Flink runtime StreamExecutionEnvironment to generate the final output. These jobs are run by the run method of FlinkStreamlet, which produces a StreamletExecution that manages the execution of the Flink job. The FlinkStreamletLogic may contain instance values, since it is only constructed at runtime. The Streamlet, however, is also instantiated at compile time to extract metadata, and therefore must not contain any instance values.

    Override the method buildExecutionGraph to build the computation graph that needs to run as part of the business logic for the FlinkStreamlet.

    Here's an example of how to provide a specialized implementation of FlinkStreamletLogic as part of implementing a custom FlinkStreamlet:

      // new custom `FlinkStreamlet`
      // define inlets, outlets and shape
    
      // provide custom implementation of `FlinkStreamletLogic`
      override def createLogic() = new FlinkStreamletLogic {
        override def buildExecutionGraph = {
          val ins: DataStream[Data] = readStream(in)
          val simples: DataStream[Simple] = ins.map(r => new Simple(r.getName()))
          writeStream(out, simples)
        }
      }

Value Members

  1. object ClusterFlinkJobExecutor extends FlinkJobExecutor

    Execution in blocking mode.

  2. object FlinkStreamletRuntime extends StreamletRuntime with Product with Serializable
  3. object LocalFlinkJobExecutor extends FlinkJobExecutor

    Future-based execution of Flink jobs in the local sandbox.
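
    To make the relationship between the executors above concrete, here is a schematic sketch of the strategy split they suggest: one sealed trait with an execution method, implemented once for blocking execution in a cluster and once for Future-based execution in the local sandbox. This is not the library's API; the trait name, the executeJob method, and its signature are purely illustrative assumptions.

      import scala.concurrent.{ ExecutionContext, Future }
      import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment

      // Schematic only: the real FlinkJobExecutor members are internal to the library
      sealed trait JobExecutorSketch extends Serializable {
        def executeJob(env: StreamExecutionEnvironment, jobName: String)(implicit ec: ExecutionContext): Unit
      }

      // Cluster mode: submit the job and block until it terminates
      object ClusterExecutorSketch extends JobExecutorSketch {
        def executeJob(env: StreamExecutionEnvironment, jobName: String)(implicit ec: ExecutionContext): Unit =
          env.execute(jobName)
      }

      // Local sandbox: run the same submission on a Future so the caller is not blocked
      object LocalExecutorSketch extends JobExecutorSketch {
        def executeJob(env: StreamExecutionEnvironment, jobName: String)(implicit ec: ExecutionContext): Unit =
          Future(env.execute(jobName))
      }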
