Photo by Andrew Neel on Unsplash

A web dev engineer's review: Learning Swift using ChatGPT

ChatGPT was released back in November 2022, and it caused quite a stir online. Scaremongers say it will change everything and that AI is ready to take our jobs, while sceptics say it won't change anything. I was curious about how good ChatGPT really is, so I decided to run an experiment and have been testing it out myself over the past couple of months. I wanted to see if it could help me learn iOS mobile app development and how it fared compared to widely used methods such as "googling" and following text or video tutorials.

My background

I've mainly been working in the web sphere during my career, which means I know next to nothing about native app development, especially in Apple's developer ecosystem. I briefly tested out Android and hybrid app development a few years back, but never checked out any of Apple's platforms. So I came into this experiment with zero knowledge of Xcode, the IDE for developing apps for any of Apple's platforms, or Swift, Apple's programming language for building them. This should be the perfect scenario to test ChatGPT's limits.

Chosen project

What better way is there to learn about app development than creating an app? I love reading Chinese online novels; unfortunately, there isn't a good free reading app on iOS (Android users have a lot more choices). So I decided this would be my project for learning Swift. The app is essentially an HTML scraper that, given some instructions (e.g. URLs and CSS selectors), can parse HTML into data which is then rendered on the screen. It needs to be able to search multiple websites for books and pull information about a book such as its name, author, table of contents URL, etc. Then, using the table of contents URL, it needs to parse a list of chapters and, finally, fetch the content of the chapter currently being read. Other features would improve the reading experience as well, such as the ability to change the background colour, support for multiple fonts, text-to-speech, data caching, etc.

In my head, a lot of the functionalities boiled down to UI-specific stuff, and I was pretty confident I'd be able to get them implemented without much issue. Others might be more difficult, especially learning how to parse HTML in Swift.
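For context, Swift has no first-party HTML parser, and a common third-party choice is SwiftSoup, which offers a jQuery-style CSS selector API. A minimal sketch of the kind of scraping involved; the struct, function, and every selector below are hypothetical placeholders, since each source website would supply its own:

```swift
import SwiftSoup  // third-party library: github.com/scinfu/SwiftSoup

// Hypothetical shape of the scraped data.
struct BookInfo {
    let title: String
    let author: String
    let tocURL: String
}

// Parses a (hypothetical) book page; the CSS selectors here are
// placeholders that would come from per-site instructions.
func parseBookPage(html: String) throws -> BookInfo {
    let doc = try SwiftSoup.parse(html)
    let title = try doc.select("h1.book-title").first()?.text() ?? ""
    let author = try doc.select("span.author").first()?.text() ?? ""
    let tocURL = try doc.select("a.toc-link").first()?.attr("href") ?? ""
    return BookInfo(title: title, author: author, tocURL: tocURL)
}
```

Driving the parser from per-site configuration (URLs plus selectors) rather than hard-coded logic is what lets one app cover multiple websites.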

In summary, I was very curious to learn more about both ChatGPT and iOS app development. With my motivation high, I cracked on at the beginning of Feb 2023.

ChatGPT only provides text-based answers

After I opened up Xcode, I was very lost. Although some of the interface made sense to me, most of it was alien, as I've always preferred text editors such as Sublime Text and VS Code over IDEs. This highlighted a massive limitation of ChatGPT: it doesn't do images. When learning something new, an image can be worth more than a thousand words. Although it gave a lot of good background information about Xcode and some high-level features, it could not provide a basic walkthrough of the Xcode interface. I had to resort to watching a few YouTube videos covering the basics of Xcode's various buttons and panels.

Maybe one day ChatGPT 10 will be able to utilise more data formats, but right now it is severely limited by the fact that it can only communicate via text. Anyone who intends to use ChatGPT for anything that relies heavily on visuals might be disappointed with what it can do.

ChatGPT is amazing at answering basic questions

I was massively impressed with how easily ChatGPT answered my basic and stupid questions early on regarding both Swift and SwiftUI. It not only generated code to further illustrate its points but was also able to contextually continue the conversation when I asked follow-up questions.

For example, I asked it the difference between @State and @StateObject in SwiftUI. Then, without providing any context, I asked it to update the code example from its previous answer to showcase a simpler use case of @State.
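To give a flavour of the topic, here is my own illustration (not ChatGPT's output): @State holds simple value-type state local to a view, while @StateObject holds a reference-type ObservableObject that the view creates and keeps alive across re-renders. The view and model names are made up for this sketch:

```swift
import SwiftUI

// @StateObject case: a reference-type model owned by the view,
// kept alive across view re-renders.
final class CounterModel: ObservableObject {
    @Published var count = 0
}

struct CounterView: View {
    // @State case: lightweight value-type state local to this view.
    @State private var label = "Taps"
    @StateObject private var model = CounterModel()

    var body: some View {
        Button("\(label): \(model.count)") {
            model.count += 1  // updating @Published re-renders the view
        }
    }
}
```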


With ChatGPT's help, I made a lot of progress at the beginning and quickly built up a basic book page screen that contained book cover images, author, word count, description of the book, etc.

ChatGPT even helped me correct my Xcode configuration when I had to request data from an HTTP URL instead of HTTPS. It turns out Apple blocks non-secure connections by default, and unless I added an exception to my Info.plist, the app refused to make the connection.
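For reference, an App Transport Security exception of this kind looks roughly like the following in Info.plist. This sketch uses the blanket `NSAllowsArbitraryLoads` key; scoping the exception to specific hosts via `NSExceptionDomains` is the stricter option:

```xml
<key>NSAppTransportSecurity</key>
<dict>
    <!-- Allows plain-HTTP connections app-wide; prefer
         NSExceptionDomains for specific hosts where possible. -->
    <key>NSAllowsArbitraryLoads</key>
    <true/>
</dict>
```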

ChatGPT struggles with complex code problems

Later, ChatGPT showed its limits when I hit a more complex problem. I wanted to take all the text from a single chapter (a long string) and split it up into multiple chunks, where each chunk fits within the currently available screen space. I needed this because I wanted to allow people to swipe horizontally and flip through pages like in many other reading apps.

It turns out this is a non-trivial problem in Swift, and I struggled to get ChatGPT to give good answers. Some of the shortcomings of ChatGPT include:

  • ChatGPT gave answers that did not satisfy my requirements. I tried rewording my request in different ways, but it was never able to provide code that split up the pages efficiently. It only managed several half-complete and suboptimal solutions.
  • Some answers contained basic coding errors such as referring to APIs that did not exist in Swift. Interestingly, when I pasted in the Xcode error it was able to recognise its mistakes and apologised.
  • ChatGPT started losing context between questions. Sometimes my follow-up question's changes were applied to some new snippet of code, or changing the previous code introduced other errors. But most often, it would give me brand-new code out of nowhere when asked to change something in the previous example.
  • It was trained on old data (it has limited knowledge of the world and events after 2021), which means ChatGPT is not aware of newer programming language changes. This can lead to some silly answers, such as returning deprecated code.

This was a problem I was stuck on for days; ChatGPT and I went around in circles, and it was frustrating. From this experience it is easy to see that ChatGPT is essentially a better Google search: it understands the question better and filters the results with more context. But at the end of the day, that is all it is, a better Google, and it is not perfect.
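For the curious, a rough sketch of one way this kind of pagination can be approached with TextKit, by laying the whole string out and asking how many characters fit in each page-sized container. This is my own outline of the general technique, not the solution I shipped, and it glosses over details such as attributes and dynamic page sizes:

```swift
import UIKit

// Splits `text` into page-sized substrings by laying it out with
// TextKit and reading back the character range that fits each page.
func paginate(_ text: String, font: UIFont, pageSize: CGSize) -> [String] {
    let storage = NSTextStorage(string: text, attributes: [.font: font])
    let layoutManager = NSLayoutManager()
    storage.addLayoutManager(layoutManager)

    var pages: [String] = []
    var glyphIndex = 0
    while glyphIndex < layoutManager.numberOfGlyphs {
        // Each text container represents one page of available space.
        let container = NSTextContainer(size: pageSize)
        layoutManager.addTextContainer(container)
        let glyphRange = layoutManager.glyphRange(for: container)
        if glyphRange.length == 0 { break }  // guard against an infinite loop
        let charRange = layoutManager.characterRange(forGlyphRange: glyphRange,
                                                     actualGlyphRange: nil)
        if let range = Range(charRange, in: text) {
            pages.append(String(text[range]))
        }
        glyphIndex = NSMaxRange(glyphRange)
    }
    return pages
}
```

The resulting array maps naturally onto a horizontally paging view such as a `TabView` with the page style, one element per page.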

ChatGPT can be very slow

There have been times when I simply gave up on asking ChatGPT questions because it was taking a very long time to respond. This is especially true when I used it during peak hours (on the free tier). However, its normal response time can be long as well. It generates its answers progressively, word by word, at a roughly constant rate, which means complex questions that require longer answers will always take longer than short ones. This is very different from googling for results: some of the results might be of low relevance, but they come back quickly.

When ChatGPT is working well, having it on the side feels like pair programming with another engineer, and bouncing questions off it is a great way to learn. However, the feedback loop was often too long, which took me out of the "zone". I found myself juggling more ideas in my head than usual as I tried to break questions down into manageable chunks for ChatGPT, then holding on to the extra information while watching it slowly generate each word back to me.

I slowly stopped using ChatGPT

As I got better and more comfortable with Swift and Xcode, I used ChatGPT less and less. This highlights the fact that ChatGPT is just another tool in an engineer's toolbox and another source of information, like Google results, YouTube videos, etc.

At the start of the project, ChatGPT was contributing more to it than I was. But as soon as I started grasping the basics of Swift, I took over and only asked clarification questions such as "Is it better to use @AppStorage or UserDefaults in Swift?" to confirm my understanding and avoid bad practices. ChatGPT's diplomatic answers always made me smile: it would explain both concepts and conclude that "it depends on your use case".
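For illustration (my own summary rather than ChatGPT's reply): @AppStorage is a SwiftUI wrapper over UserDefaults that additionally refreshes the view when the stored value changes, so inside SwiftUI views it is usually the more convenient of the two. The view and the "fontSize" key here are made up for the sketch:

```swift
import SwiftUI

struct SettingsView: View {
    // @AppStorage reads/writes UserDefaults *and* invalidates the
    // view whenever the value changes.
    @AppStorage("fontSize") private var fontSize: Double = 16

    var body: some View {
        Slider(value: $fontSize, in: 12...32)
    }
}

// Plain UserDefaults: the same backing store, but no automatic
// view updates, so it suits non-view code.
func savedFontSize() -> Double {
    UserDefaults.standard.double(forKey: "fontSize")
}
```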


In summary, I have been very impressed with ChatGPT over the past couple of months. It has helped me work through some obscure issues, but when it came to complex programming problems, it struggled. So I had no choice but to architect the app and take on the complex issues myself.

I have to admit ChatGPT made coding more fun, especially when working on a personal project where there are no colleagues to chat with about approaches and problems. When I get stuck on a coding problem, I often find it hard to find the motivation to continue. ChatGPT has made it less daunting when I hit a brick wall. It is like a patient and knowledgeable friend, guiding and pushing me towards my goal.

Finally, in terms of the project, I'm happy to share that after around two months the app is almost done. There are a few more features I want to implement, then I'll be asking ChatGPT to help refactor the code, as right now it is a mess!


The media love to add "AI" to anything that isn't achieved by a human, but we should be clear that ChatGPT is strictly not a true AI. It is still based on a machine learning model, albeit the best one in the world so far. It has no consciousness, and when it replies it is not writing out of an understanding of the world. This showed when I asked it the complex code problem explained above. ChatGPT is great at putting publicly available knowledge together into a short and easy-to-consume result, i.e. generating information, but it cannot "create" anything that never existed on the internet. Again, my "split this text into multiple pages" problem was a good example of this.

This also led me to question what will happen with future ChatGPT versions. Before 2021, there was little to no "AI"-generated content online (at least, little good content). But with the release of ChatGPT, I can imagine the internet will be plagued with generated content across many blog posts (I'm referring to people typing a title into ChatGPT and then publishing its response as a blog post). I am very curious how future "AI" will be trained, because if it starts to consume content generated by other "AI", it could go into a self-reinforcing spiral. If a wrong piece of information appears frequently enough in its training data, it could be treated as fact; just look at how Google's AI chatbot Bard made a factual error in its first demo. But since the authors of ChatGPT have already altered how it responds to some questions, I'm sure it's not an impossible task; it may just require a lot of manual tinkering.

Today I saw an article about how good ChatGPT is at medical diagnosis: "The newest version of ChatGPT passed the US medical licensing exam with flying colors — and diagnosed a 1 in 100,000 condition in seconds". Based on my coding experience with it so far, ChatGPT might be a good place to get some initial ideas about something, but I wouldn't trust it with my medical diagnosis just yet.

Note: this article was written with zero help from ChatGPT
