From: lexfridman

The landscape of programming languages and environments for AI and machine learning is vast and diverse. In a conversation with Jeremy Howard, co-founder of fast.ai, a range of programming languages and environments were discussed, with their strengths and weaknesses weighed in the context of AI development.

Historical Perspective

Jeremy Howard’s programming journey began with Microsoft Access, which he cites as his favorite programming environment. It let him create user interfaces and tie data and actions together in a way he argues has not been matched since [00:05:01]. Access was scripted with Visual Basic for Applications (VBA), which, despite being a poor programming language, sat inside a fantastic environment for user interaction and data manipulation [00:05:27].

Transition to Modern Languages

Delphi and Pascal

Howard favored Delphi, created by Anders Hejlsberg, for its balance of ease of use and fast compiled applications [00:07:00]. Delphi was similar to Visual Basic but compiled to fast native code, making it comparable to C# or Java without the virtual machine [00:07:45].

Perl and Python

Howard’s use of Perl stemmed from its flexibility, especially while he was building the email company FastMail [00:13:28]. Python, however, has overtaken Perl thanks to its substantial data science libraries, even though Howard considers it the less elegant language [00:14:42]. Python’s flexibility and widespread community support make it the dominant choice in deep learning today.

Contemporary Developments in AI Programming

PyTorch and fastai

For his courses and research, Howard uses PyTorch because its dynamic computation graph makes interactive computation easier than TensorFlow’s static graphs [01:09:32]. He developed the fastai library on top of PyTorch to streamline this workflow, allowing state-of-the-art models to be built with minimal code [01:11:27].
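As an illustration of what a dynamic (define-by-run) graph buys you, here is a minimal sketch using standard PyTorch calls (not code from the episode): the graph is traced as ordinary Python executes, so data-dependent control flow and immediate inspection of gradients both work.

```python
import torch

# Define-by-run: the graph is built as ordinary Python executes,
# so data-dependent control flow is allowed and gradients still work.
x = torch.randn(3, requires_grad=True)

y = x * 2
while y.norm() < 100:   # loop length depends on the data itself
    y = y * 2

y.sum().backward()      # backprop through whatever path actually ran
print(x.grad)           # gradients are inspectable immediately, interactively
```

And a hedged sketch of the kind of minimal fastai training code Howard describes, following the high-level vision API from the fastai documentation (the dataset, architecture, and epoch count here are illustrative):

```python
from fastai.vision.all import *

# The pet-breeds example from the fastai docs: a few lines of high-level
# API calls produce a transfer-learned image classifier.
path = untar_data(URLs.PETS)
dls = ImageDataLoaders.from_name_re(
    path, get_image_files(path/"images"),
    pat=r"(.+)_\d+.jpg", item_tfms=Resize(224))
learn = vision_learner(dls, resnet34, metrics=error_rate)
learn.fine_tune(1)      # fine-tune a pretrained ResNet with sensible defaults
```

The point is the ratio of code to capability: a handful of lines with sensible defaults yields a strong model.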

Swift for TensorFlow

Swift is posited as the future for AI programming due to its potential for being “infinitely hackable” [01:05:12]. Chris Lattner, Swift’s creator, has been working on Swift for TensorFlow, leveraging advanced compiler technology to optimize tensor computations, which is crucial for AI workloads [00:20:06].

Challenges in AI Programming

Computational Limits and Hardware

Current AI models demand substantial computational power, supplied today primarily by NVIDIA GPUs. Howard highlights the difficulty of programming multi-GPU or distributed systems, whose complexity can slow development considerably [01:06:11]. Devices like Google’s TPU offer further speed improvements but are hindered by programming limitations: they are almost unprogrammable by the end user [00:22:01].
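To give a sense of the friction, here is a minimal sketch of PyTorch’s simplest multi-GPU path (standard library calls, not code from the episode); the faster DistributedDataParallel route adds process groups, launch scripts, and per-rank device bookkeeping on top of this.

```python
import torch
import torch.nn as nn

# The "easy" multi-GPU path: nn.DataParallel replicates the model,
# scatters each batch across visible GPUs, and gathers the outputs.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Linear(1024, 10)
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)
model = model.to(device)

x = torch.randn(64, 1024, device=device)
out = model(x)          # the forward pass fans out over all available GPUs
```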

Slow Iteration and Innovation Barriers

The difficulty of turning theoretical AI advances into practical implementations is evident. Python’s inherent slowness hinders innovation, particularly for recurrent neural networks, which are critical to natural language processing [00:17:02].
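A small illustrative sketch of the bottleneck (standard PyTorch, not from the episode): a hand-written recurrence re-enters the Python interpreter at every timestep, whereas the library LSTM hands the whole sequence to fused kernels in one call.

```python
import torch
import torch.nn as nn

# Hand-written recurrence: one trip through the Python interpreter per step.
cell = nn.LSTMCell(input_size=256, hidden_size=512)
x = torch.randn(100, 32, 256)            # (seq_len, batch, features)
h = torch.zeros(32, 512)
c = torch.zeros(32, 512)
for t in range(x.size(0)):               # 100 interpreter round-trips
    h, c = cell(x[t], (h, c))

# Library LSTM: same math, but the whole sequence runs in fused kernels.
lstm = nn.LSTM(input_size=256, hidden_size=512)
out, (h_n, c_n) = lstm(x)
```

Researchers who want to experiment with novel recurrence variants are stuck with the slow path, which is exactly the hackability gap Howard points to.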

Conclusion

The evolution of programming languages and environments reveals a trajectory toward more integrated, user-friendly, and efficient tools capable of meeting the demands of AI research and deployment. fast.ai and Swift are positioned as leaders in this shift, valuing accessibility and flexibility to empower domain experts, mirroring Jeremy Howard’s advocacy for practical utility over purely theoretical advances [00:39:11]. The future of AI programming will likely continue this trend, addressing today’s computational and educational bottlenecks while keeping a keen eye on ethical implications and societal impacts.