How do I deploy a Hadoop + Apache Spark cluster?

Hi, I'm just getting into this architecture.
I want to deploy Hadoop for data storage and Apache Spark for processing.
What does the architecture look like from the servers' point of view?
Do I need a number of servers joined into a cluster, or is a single cloud machine where everything runs enough?
What is the right way to do this?
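To make the setup I'm asking about concrete: the usual layout I've read about co-locates HDFS (storage) and Spark (processing) on the same cluster nodes, with YARN scheduling the Spark jobs. Below is a minimal PySpark sketch assuming such a cluster; the application name and HDFS path are placeholders, not real values.

```python
# Minimal sketch, assuming Spark runs on YARN over the same nodes
# that host HDFS (the common co-located Hadoop + Spark layout).
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("example-hdfs-job")   # hypothetical application name
    .master("yarn")                # submit to the Hadoop cluster's YARN
    .getOrCreate()
)

# Read a file stored in HDFS (placeholder path) and do a trivial count.
df = spark.read.text("hdfs:///data/example/input.txt")
print(df.count())

spark.stop()
```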
March 19th 20 at 08:40
0 answers

Tags: Apache Spark, Hadoop