In this recipe, we'll see how to simulate real-time data.
To step through this recipe, you will need Kafka and ZooKeeper running on the cluster, with Scala and Java installed.
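As a quick setup sketch, the standard scripts shipped with a Kafka distribution can bring up ZooKeeper, a broker, and a topic. The topic name here is hypothetical, and the exact flags vary by Kafka version (older releases, matching the producer API used below, take --zookeeper; newer ones take --bootstrap-server):

```shell
# Run from the Kafka installation directory; config paths are the defaults
bin/zookeeper-server-start.sh config/zookeeper.properties &

# Start the Kafka broker
bin/kafka-server-start.sh config/server.properties &

# Create a topic for the simulated stream (topic name is hypothetical)
bin/kafka-topics.sh --create --zookeeper localhost:2181 \
  --replication-factor 1 --partitions 1 --topic sensor-data
```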
Since the data is available in files, let's simulate it in real time with a producer that writes the records into Kafka. Here is the code:
import java.util.{Date, Properties}
import kafka.javaapi.producer.Producer
import kafka.producer.KeyedMessage
import kafka.producer.ProducerConfig
import org.apache.spark.mllib.linalg.Vectors
import scala.io.{BufferedSource, Source}
import scala.util.Random

object KafkaProducer {
  def main(args: Array[String]): Unit = {
    val random: Random = new Random
    val props = new Properties
    props.put("metadata.broker.list", "172.22.128.16:9092")
    props.put("serializer.class", "kafka...
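The core of the simulation, independent of Kafka itself, is replaying file records with a delay between sends so they arrive like a live stream. A minimal, self-contained sketch of that replay loop is shown below; the sample records and the printing stand in for lines read from the data file and for the Kafka send, which are assumptions here:

```scala
import scala.util.Random

object ReplayProducerSketch {
  // Hypothetical sample records standing in for lines read from the data file
  val sampleLines: Seq[String] = Seq("1.0,2.0,3.0", "4.0,5.0,6.0", "7.0,8.0,9.0")

  // Replay each line with a random delay to mimic a live stream;
  // in the real recipe each line would be sent to Kafka instead of printed.
  // Returns the number of records replayed.
  def replay(lines: Seq[String], maxDelayMs: Int): Int = {
    val random = new Random
    for (line <- lines) {
      Thread.sleep(random.nextInt(maxDelayMs))
      println(s"sending record: $line")
    }
    lines.length
  }

  def main(args: Array[String]): Unit = {
    val sent = replay(sampleLines, maxDelayMs = 100)
    println(s"sent $sent records")
  }
}
```

In the full producer, each replayed line would be wrapped in a KeyedMessage and sent through the Producer configured with the properties above.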