Which tools do you use?
For prototyping and DevOps-style work, my favorite is, hands down, the combination of Jupyter, Python, and Apache Spark, since it lets you combine data preparation, analysis, visualization, and documentation in one place. And now that Spark has introduced the DataFrame API, you have to make almost no performance compromises with PySpark. Keras, in turn, gives you a really great API for TensorFlow. But of course you still orient yourself towards the customer's IT landscape and how their production environment is planned; you have to stay flexible here, otherwise the migration effort just gets far too big.