<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:cc="http://cyber.law.harvard.edu/rss/creativeCommonsRssModule.html">
    <channel>
        <title><![CDATA[Stories by Alexis Creuzot on Medium]]></title>
        <description><![CDATA[Stories by Alexis Creuzot on Medium]]></description>
        <link>https://medium.com/@alexiscreuzot?source=rss-57dd0a46a136------2</link>
        <image>
            <url>https://cdn-images-1.medium.com/fit/c/150/150/1*cR2fgBJbvLHXski_RC5DSw.jpeg</url>
            <title>Stories by Alexis Creuzot on Medium</title>
            <link>https://medium.com/@alexiscreuzot?source=rss-57dd0a46a136------2</link>
        </image>
        <generator>Medium</generator>
        <lastBuildDate>Sat, 09 May 2026 12:38:47 GMT</lastBuildDate>
        <atom:link href="https://medium.com/@alexiscreuzot/feed" rel="self" type="application/rss+xml"/>
        <webMaster><![CDATA[yourfriends@medium.com]]></webMaster>
        <atom:link href="http://medium.superfeedr.com" rel="hub"/>
        <item>
            <title><![CDATA[Reducing Core ML 2 Model Size by 4X Using Quantization in iOS 12]]></title>
            <link>https://heartbeat.comet.ml/reducing-coreml2-model-size-by-4x-with-quantization-in-ios12-b1c854651c4?source=rss-57dd0a46a136------2</link>
            <guid isPermaLink="false">https://medium.com/p/b1c854651c4</guid>
            <category><![CDATA[coreml]]></category>
            <category><![CDATA[mobile-ml]]></category>
            <category><![CDATA[heartbeat]]></category>
            <category><![CDATA[ios-app-development]]></category>
            <category><![CDATA[ios]]></category>
            <dc:creator><![CDATA[Alexis Creuzot]]></dc:creator>
            <pubDate>Thu, 01 Nov 2018 14:41:01 GMT</pubDate>
            <atom:updated>2021-09-30T14:38:18.777Z</atom:updated>
<content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*D_RHzFfpGfsnVqw-keq6Vg.jpeg" /></figure><p>This year, Apple introduced Core ML 2 at WWDC 2018, with a focus on making machine learning more flexible and powerful for developers to use.</p><p>With the release of this new and improved framework, Apple also announced a freshly updated version of <a href="https://heartbeat.comet.ml/using-coremltools-to-convert-a-keras-model-to-core-ml-for-ios-d4a0894d4aba">coremltools</a>. This handy Python library can be used to convert trained models into the Core ML format, as well as to make predictions directly from your machine.</p><p>For mobile applications using Core ML, one of the main burdens is the model size. A heavy app can discourage some users from downloading it. As such, developers often end up storing models in the cloud, costing both time and money.</p><h3>Introducing Quantization</h3><p><a href="https://heartbeat.comet.ml/8-bit-quantization-and-tensorflow-lite-speeding-up-mobile-inference-with-low-precision-a882dfcafbbd">Quantization</a> is one of the new features in the updated coremltools, and it can help solve this size problem by reducing — sometimes drastically — the size of Core ML models.</p><p>It works by trimming the number of bits used to describe the weights in a model. As some models have millions of weights, shaving a few bits off each one can have a tremendous impact on overall size.</p><p>In iOS 11, models could only use 32-bit floats to describe weights. This was later improved in iOS 11.2 with the introduction of half-precision floats, using 16 bits for comparable accuracy.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*TufzwxhxJ6IZhxKiyJP_mA.jpeg" /></figure><p>With <a href="https://heartbeat.comet.ml/machine-learning-on-ios-12-and-the-new-iphone-x-series-8419155ab5b6">iOS 12</a>, weights can now be encoded using any number of bits, all the way down to just 1 bit.</p><p>So now, instead of the continuous range of values weights would have as floats, we end up with a discrete subset. Of course, this will lower your prediction accuracy, but it’s up to you to decide what tradeoff you can accept.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*mwb7rRrAG_lvwUAcWawSpg.jpeg" /><figcaption>A linear distribution of 3-bit values</figcaption></figure><p>There are multiple ways to choose the values representing those new, quantized weights. The most straightforward is to distribute them linearly, like in the example above.</p><p>We can also have them distributed in other, arbitrary ways using a <a href="https://en.wikipedia.org/wiki/Lookup_table">lookup table (or LUT)</a>, based on the particularities of your model.</p><p>In total, you have access to these 4 distribution modes: linear, kmeans, linear_lut, and custom_lut.</p><h3>Quantizing your Core ML model</h3><p>Let’s dive into the code!</p><p>First, there are a few prerequisites:</p><ul><li>You must be running macOS 10.14 (Mojave)</li><li>You must have the latest version of coremltools installed. A simple pip install coremltools==2.0 should do it.</li></ul><p>Once your environment is ready, you should be able to run the script embedded below, editing its last few lines to feed in your own model filename along with the combination of bits-per-weight and distribution mode you wish to use.</p>
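<p>To build some intuition before running it, here is a tiny, self-contained Swift sketch of what the linear mode does to an array of weights: snap each Float32 value onto one of 2^n evenly spaced levels and keep a small lookup table to map them back. The function and its names are mine, purely for illustration (the real conversion happens in Python inside coremltools), but it shows why fewer bits per weight means a lighter file.</p><pre>import Foundation

// Illustrative only: linear quantization of Float32 weights to `bits` bits.
// The real work is done by coremltools; this just mirrors the math.
func linearQuantize(_ weights: [Float], bits: Int) -&gt; (indices: [UInt8], levels: [Float]) {
    let levelCount = 1 &lt;&lt; bits // 2^bits discrete levels, e.g. 8 levels for 3 bits
    guard let lo = weights.min(), let hi = weights.max(), hi &gt; lo else {
        return (Array(repeating: 0, count: weights.count), [weights.first ?? 0])
    }
    let step = (hi - lo) / Float(levelCount - 1)
    // The lookup table: levelCount values spread linearly over [lo, hi]
    let levels = (0..&lt;levelCount).map { lo + Float($0) * step }
    // Snap every weight to the index of its nearest level
    let indices = weights.map { UInt8((($0 - lo) / step).rounded()) }
    return (indices, levels)
}

let weights: [Float] = [-1.2, -0.4, 0.05, 0.3, 0.31, 0.9, 1.5]
let (indices, levels) = linearQuantize(weights, bits: 3)
// Each weight now needs 3 bits instead of 32; the model only has to ship
// the tiny `levels` table plus the packed indices.
print(indices)                         // [0, 2, 3, 4, 4, 5, 7]
print(indices.map { levels[Int($0)] }) // the weights, snapped to 8 levels</pre>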
<iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/939fe84316b355801a8f2309bd3bea81/href">https://medium.com/media/939fe84316b355801a8f2309bd3bea81/href</a></iframe><p>Right away, you should see that the fewer bits per weight used, the lighter the quantized file.</p><p>This script was used on one of <a href="https://itunes.apple.com/us/app/looq-ai-powered-filters/id1159704664">Looq’s</a> Neural Style Transfer models. It gives you an idea of the accuracy loss in a real-world use case.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*6zC-J5MN4J7JRTNIFmzPJg.jpeg" /><figcaption>Neural Style Transfer model output using different quantization settings</figcaption></figure><p>The first image is the unstyled picture, for reference. After it are displayed the outputs of the original, 16-bit, 8-bit, 6-bit, 3-bit, 4-bit, 2-bit and 1-bit quantized models.</p><p>Now let’s look at the actual size difference between each of those files.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*LzwPoHa_4oovM35wvMsZ3A.jpeg" /><figcaption>Size per model in MB</figcaption></figure><p>In this case, we can see that the 8-bit model output is nearly identical to the original one. This means we can take the model from 6.7MB down to 1.7MB, 4 times lighter!</p><p>This example is, of course, very specific to NST models, and a linear distribution won’t always work. Fortunately, you should be able to apply the same approach to your own models, as long as you have a good understanding of how those values should be distributed.</p><h3>Save your models!</h3><p>As of now, there is no way to “re-quantize” a model: if you’ve quantized a model to 16-bit and later decide to make an 8-bit version, you won’t be able to unless you still have your original file.</p><p>I hope you found this article helpful. If you want to read more on machine learning (and more specifically, Core ML), have a look at my piece on <a href="https://medium.com/@alexiscreuzot/building-a-neural-style-transfer-app-on-ios-with-pytorch-and-coreml-76e00cd14b28">how to build a Neural Style Transfer app</a>!</p><p><strong>Discuss this post on </strong><a href="https://news.ycombinator.com/item?id=18354426"><strong>Hacker News</strong></a><strong>.</strong></p><p><em>Editor’s Note: </em><a href="https://heartbeat.comet.ml/"><em>Heartbeat</em></a><em> is a contributor-driven online publication and community dedicated to providing premier educational resources for data science, machine learning, and deep learning practitioners. We’re committed to supporting and inspiring developers and engineers from all walks of life.</em></p><p><em>Editorially independent, Heartbeat is sponsored and published by </em><a href="http://comet.ml/?utm_campaign=heartbeat-statement&amp;utm_source=blog&amp;utm_medium=medium"><em>Comet</em></a><em>, an MLOps platform that enables data scientists &amp; ML teams to track, compare, explain, &amp; optimize their experiments. We pay our contributors, and we don’t sell ads.</em></p><p><em>If you’d like to contribute, head on over to our</em><a href="https://heartbeat.fritz.ai/call-for-contributors-october-2018-update-fee7f5b80f3e"><em> call for contributors</em></a><em>.
You can also sign up to receive our weekly newsletters (</em><a href="https://www.deeplearningweekly.com/"><em>Deep Learning Weekly</em></a><em> and the </em><a href="https://info.comet.ml/newsletter-signup/"><em>Comet Newsletter</em></a><em>), join us on </em><a href="https://join.slack.com/t/cometml/shared_invite/zt-49v4zxxz-qHcTeyrMEzqZc5lQb9hgvw"><em>Slack</em></a><em>, and follow Comet on </em><a href="https://twitter.com/Cometml"><em>Twitter</em></a><em> and </em><a href="https://www.linkedin.com/company/comet-ml/"><em>LinkedIn</em></a><em> for resources, events, and much more that will help you build better ML models, faster.</em></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=b1c854651c4" width="1" height="1" alt=""><hr><p><a href="https://heartbeat.comet.ml/reducing-coreml2-model-size-by-4x-with-quantization-in-ios12-b1c854651c4">Reducing Core ML 2 Model Size by 4X Using Quantization in iOS 12</a> was originally published in <a href="https://heartbeat.comet.ml">Heartbeat</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Building a Neural Style Transfer app on iOS with PyTorch and CoreML]]></title>
            <link>https://alexiscreuzot.medium.com/building-a-neural-style-transfer-app-on-ios-with-pytorch-and-coreml-76e00cd14b28?source=rss-57dd0a46a136------2</link>
            <guid isPermaLink="false">https://medium.com/p/76e00cd14b28</guid>
            <category><![CDATA[coreml]]></category>
            <category><![CDATA[machine-learning]]></category>
            <category><![CDATA[ios]]></category>
            <category><![CDATA[tech]]></category>
            <category><![CDATA[pytorch]]></category>
            <dc:creator><![CDATA[Alexis Creuzot]]></dc:creator>
            <pubDate>Mon, 27 Aug 2018 12:39:00 GMT</pubDate>
            <atom:updated>2021-04-21T18:10:42.406Z</atom:updated>
<content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*_T4oPSWIn135jFG9lECL0Q.jpeg" /><figcaption>Original image by <a href="https://medium.com/u/1fd9daae7d53">joseph barrientos</a></figcaption></figure><p>In June 2017, Apple introduced CoreML, a framework designed to integrate Machine Learning models into iOS apps.<br>This opened up a great many possibilities for developers, from image analysis to NLP (Natural Language Processing), decision tree learning and more.</p><p>During Spring 2018 I started to dive into CoreML after reading a few articles on another domain of Machine Learning: <a href="https://towardsdatascience.com/a-brief-introduction-to-neural-style-transfer-d05d0403901d">Neural Style Transfer</a>. This ended up mutating into a full-fledged project, <a href="https://itunes.apple.com/us/app/looq-ai-powered-filters/id1159704664?mt=8">Looq</a>.</p><p>In this article, I will explain the basic building blocks required to create this kind of app, and hopefully pass on a few of the things I learned along the way.</p><h3>Training an NST model</h3><p>In a nutshell, ML models are similar to functions: they take one or more inputs and return one or more outputs. One big difference, though, is that as a developer you don’t write a model, you train it. The trained model can then be fed inputs which are processed through its <a href="https://machinelearningmastery.com/introduction-to-tensors-for-machine-learning/">underlying tensors</a> to reach a result, or prediction.<br>Our knowledge is therefore shifted to a higher-level task: implementing the training algorithm.</p><p>Multiple frameworks are available to achieve this: <a href="https://www.tensorflow.org/">Tensorflow</a>, <a href="https://keras.io/">Keras</a>, <a href="http://caffe.berkeleyvision.org/">Caffe</a>, <a href="https://pytorch.org/">PyTorch</a>… We will focus on the last one in this article, PyTorch, as it provides strong GPU acceleration, which will prove to be an important feature in the next steps.<br>To get started, I recommend using <a href="https://docs.anaconda.com/anaconda/install/">Anaconda</a>, which makes it very easy to manage your environment in case you have several conflicting dependencies on your workstation. If you have an NVIDIA graphics card, it’s very important that you also install the proper <a href="https://developer.nvidia.com/cuda-toolkit">CUDA toolkit</a> to enable GPU acceleration during training.</p><p>Once your environment is set, you can go to the <a href="https://github.com/pytorch/examples">PyTorch Github repo</a>, which lists multiple usage examples, one being a <a href="https://github.com/pytorch/examples/tree/master/fast_neural_style">Fast Neural Style</a> sample. Clone this repo onto your workstation and activate your environment.<br>You should now be able to try training your first model using the default parameters. This will start a training loop which saves a checkpoint every two thousand iterations.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/195ea7d4bc6a574b2c5ae5d7287200ef/href">https://medium.com/media/195ea7d4bc6a574b2c5ae5d7287200ef/href</a></iframe><p>Those checkpoints are nothing more than the model at a given number of iterations.
Usually, the more iterations the better, but in our case we are aiming for beauty, which is subjective and difficult to measure without actually having a look at the result.<br>You can look at how <strong>neural_style.py</strong> implements the actual training and what makes the model “improve” from one iteration to the next, using multiple knobs like <strong>style-weight</strong>, <strong>content-weight</strong>, <strong>regression loss</strong> and more.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/d7cf3af6bd26982d505e9d2b16c7ec54/href">https://medium.com/media/d7cf3af6bd26982d505e9d2b16c7ec54/href</a></iframe><p>Trying other style images, you’ll find that tweaking <strong>style-weight</strong> and <strong>content-weight</strong> is a good place to start in order to improve the model’s predictions.</p><h3>Exporting your model to CoreML</h3><p>Here comes one of the tricky parts. PyTorch doesn’t natively support exports to CoreML (yet). Fortunately, you might have heard about an endeavour aiming to make Machine Learning models interoperable: the <a href="http://onnx.ai">Open Neural Network Exchange</a>.<br>ONNX provides tools for importing and exporting models from almost any framework to the .onnx format (<a href="https://xkcd.com/927/">mandatory xkcd</a>).</p><p>Hence, we can export our model to CoreML in 2 phases:</p><p>PyTorch → ONNX → CoreML</p><p><strong>neural_style.py</strong> already has an ONNX export, so we really just need to implement the second step. We can find everything we need to bridge that gap on the <a href="https://github.com/onnx/onnx-coreml">onnx-coreml Github repo</a>.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/d280cf354db2f088b095eb4604464ef6/href">https://medium.com/media/d280cf354db2f088b095eb4604464ef6/href</a></iframe><p>Now that we have our ONNX to CoreML converter, we can convert a checkpoint with a simple bash command.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/268e95b19b408ceb378c8602eabee17e/href">https://medium.com/media/268e95b19b408ceb378c8602eabee17e/href</a></iframe><h3>Using a CoreML NST model on iOS</h3><p>You should feel more at home in this part if, like me, you are an iOS developer.</p><p>First, we create a simple project with a <strong>UIImageView</strong> to hold our original and output image, as well as a <strong>UIButton</strong> to launch the process.<br>Next, we add a <strong>UIBarButtonItem</strong> to pick an image from the device’s photo library and set it onto the <strong>UIImageView</strong>.<br>Alternatively, we can just provide a default image and use that one.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*AfTqAedh6AO11lb2oFtpBQ.jpeg" /><figcaption>Main Storyboard</figcaption></figure><p>Once the UI is implemented, we can import our model into the app (download <a href="https://github.com/kirualex/NSTDemo/blob/master/NSTDemo/StarryNight.mlmodel">this pre-trained model</a> if you don’t have one). This is done by simply dragging and dropping the file onto Xcode’s file view.<br>By clicking on the model, you can see some important details about it:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*OVoQ6fUMu7wP0-IZxlC3zw.jpeg" /><figcaption>Xcode model file view</figcaption></figure><ul><li>Specifications (Name, Type, Size, Author, Description and License)</li><li>Evaluation parameters (Inputs and Outputs, with their respective names and expected types)</li></ul><p>You can also see a Model Class which was auto-generated by Xcode. You can click on the little arrow next to the class name to jump into the actual file and see the API that will enable us to use our NST model. The function we will call here is <strong>prediction(inputImage: CVPixelBuffer)</strong>.</p><p>Yep, it might be counter-intuitive, but in CoreML the image type is <strong>CVPixelBuffer</strong>, not <strong>UIImage</strong>. Fortunately, Matthijs Hollemans has an <a href="http://github.com/hollance/CoreMLHelpers">awesome library</a> to help us make that conversion easily (and many more things).</p><p>So here are the steps once we have picked the original image:</p><ul><li>Save the input image size</li><li>Convert the input <strong>UIImage</strong> to a 720 x 720 <strong>CVPixelBuffer</strong>, as specified in the model view</li><li>Feed it to our model</li><li>Convert the <strong>CVPixelBuffer</strong> output to a <strong>UIImage</strong></li><li>Resize it to the original size saved in the first step</li></ul><p>Here is a Swift 4 excerpt of the implementation of those steps.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/9dbefab01ee9f0ea6983b60b88304a53/href">https://medium.com/media/9dbefab01ee9f0ea6983b60b88304a53/href</a></iframe>
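<p>In case the embed above doesn’t render in your reader, here is a rough reconstruction of those five steps. The model class name (<strong>StarryNight</strong>), its output name and the UIImage/CVPixelBuffer conversion helpers (assumed to come from CoreMLHelpers) are guesses on my part, so treat this as a sketch rather than the exact excerpt:</p><pre>import UIKit
import CoreML

// Sketch of the five steps above. `StarryNight` is the Xcode-generated model
// class; `pixelBuffer(width:height:)` and `UIImage(pixelBuffer:)` are assumed
// to come from CoreMLHelpers.
func stylize(_ input: UIImage, completion: @escaping (UIImage?) -&gt; Void) {
    DispatchQueue.global(qos: .userInitiated).async {
        // 1. Save the input image size
        let originalSize = input.size
        // 2. Convert the UIImage to the 720 x 720 CVPixelBuffer the model expects
        guard let buffer = input.pixelBuffer(width: 720, height: 720),
              // 3. Feed it to our model
              let output = try? StarryNight().prediction(inputImage: buffer),
              // 4. Convert the CVPixelBuffer output back to a UIImage
              let styled = UIImage(pixelBuffer: output.outputImage) else {
            DispatchQueue.main.async { completion(nil) }
            return
        }
        // 5. Resize the result back to the original size
        let renderer = UIGraphicsImageRenderer(size: originalSize)
        let resized = renderer.image { _ in
            styled.draw(in: CGRect(origin: .zero, size: originalSize))
        }
        DispatchQueue.main.async { completion(resized) }
    }
}</pre>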
<p>We’re done! You will see that it doesn’t hurt to do all this on a background thread, as it is pretty CPU intensive. It’s also a real memory hog, so you might have to make some optimisations to get your code working on an actual device!</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*uVbM1WZpjHjCHpOgQ5sF_A.jpeg" /><figcaption>Original and output image</figcaption></figure><p>If you wish to get your hands dirty, you can find the sources for <a href="https://github.com/kirualex/NSTDemo">this project on Github</a>.</p><p>I hope you have learned a few things about CoreML, or even Machine Learning, in this article. If you want to go further, <a href="https://heartbeat.fritz.ai/reducing-coreml2-model-size-by-4x-with-quantization-in-ios12-b1c854651c4">have a look at how I managed to reduce my model size by 4x using quantization for iOS 12</a>.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=76e00cd14b28" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Expanding UITableView cells using only constraints in Swift]]></title>
            <link>https://alexiscreuzot.medium.com/expanding-uitableview-cells-using-only-constraints-in-swift-f40b13206ea3?source=rss-57dd0a46a136------2</link>
            <guid isPermaLink="false">https://medium.com/p/f40b13206ea3</guid>
            <category><![CDATA[ios]]></category>
            <category><![CDATA[swift]]></category>
            <dc:creator><![CDATA[Alexis Creuzot]]></dc:creator>
            <pubDate>Sun, 13 Nov 2016 15:20:42 GMT</pubDate>
            <atom:updated>2016-11-13T15:20:42.701Z</atom:updated>
<content:encoded><![CDATA[<p>Here is a common requirement you may have stumbled upon a few times when creating an iOS app.</p><p>First, the result we want to achieve.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*6yZzLRRjT8D2DOonRPAmsQ.gif" /><figcaption>Final result</figcaption></figure><p>Let’s hop into Xcode and create a new “Single View Application” project. We’re going to begin by designing our interface.<br>A simple <em>UINavigationController</em> with our <em>ViewController</em> as its root will do.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*HyFoaSgvVSdxcDE5bjizjw.png" /><figcaption>Main.storyboard</figcaption></figure><p>Let’s add our <em>UITableView</em> and add a <em>UITableViewCell</em> to it. We need two labels: one that will be our title, and a second one below it that will be our subtitle, either shown or hidden.</p><p>As we want AutoLayout to compute the cell height itself, we need to set up the constraints accordingly.<br>Add a leading and trailing constraint to both labels. Then we want to add a top constraint between the title and its superview, a bottom constraint between the subtitle and its superview, and a vertical spacing constraint between both labels.<br>This will allow AutoLayout to understand that we want the superview to grow based on our labels’ intrinsic heights. We’ll also set both labels’ <em>numberOfLines</em> property to zero, so they don’t get truncated.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*VvLeusVeG-6pKKKTnmeN5A.png" /><figcaption>Constraints on our ExpandingTableViewCell</figcaption></figure><p>At that point AutoLayout may display an error, asking you to set a different vertical hugging priority on one of the labels. Seems logical: for now both have the same priority, and AutoLayout needs to know which one should take precedence. We can decrease the vertical hugging priority of one label, or set a fixed height constraint on either of them, to get rid of this error.</p><p>Our UI seems good for now, let’s dive into the code! First, we need to create a new class for our prototyped <em>UITableViewCell</em>. We can call it <em>ExpandingTableViewCell</em> for the sake of originality.<br>Add both labels’ outlets and link them to their actual views in Interface Builder.</p><p>Now, it’s a good idea to have a backing class for our cell. You could use a simple dictionary, but having typed properties is always a plus. To accurately represent our cell state, we need a title, a subtitle and a boolean indicating whether the cell is expanded or not.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1000/1*AeWvFuejoI3bvtwzqN1k9w.png" /><figcaption>ExpandingTableViewCell.swift</figcaption></figure><p>Now that this is taken care of, onto our <em>ViewController</em>!</p><p>We want a property linked to our <em>UITableView</em>, and another for the array that will contain our cell representations. Let’s populate it with a few <em>ExpandingTableViewCellContent</em> instances so we have something to display later.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*8UkLAxdlcG1VETlM71SGzA.png" /><figcaption>ViewController.swift</figcaption></figure><p>We still have to implement the datasource for the <em>UITableView</em>, and more importantly the delegate, where the magic happens!<br>In our delegate method <em>didSelectRowAtIndexPath</em>, we want 2 things:<br>- Change the <em>expanded</em> state of our backing object at that particular index<br>- Reload the cell to reflect its new state</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*yOqtlCnjoKdBNuLb41tmkA.png" /><figcaption>UITableViewDataSource and UITableViewDelegate implementation in ViewController.swift</figcaption></figure>
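<p>In case the screenshots are hard to read, the whole thing boils down to something like this (a sketch with names assumed from the article, <em>contents</em> being our backing array):</p><pre>// UITableViewDataSource
func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -&gt; Int {
    return contents.count
}

func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -&gt; UITableViewCell {
    let cell = tableView.dequeueReusableCell(withIdentifier: "ExpandingTableViewCell",
                                             for: indexPath) as! ExpandingTableViewCell
    let content = contents[indexPath.row]
    cell.titleLabel.text = content.title
    // A collapsed cell simply gets no subtitle text, so AutoLayout shrinks it
    cell.subtitleLabel.text = content.expanded ? content.subtitle : nil
    return cell
}

// UITableViewDelegate
func tableView(_ tableView: UITableView, didSelectRowAt indexPath: IndexPath) {
    // 1. Toggle the expanded state of the backing object at that index
    let content = contents[indexPath.row]
    content.expanded = !content.expanded
    // 2. Reload the cell to reflect its new state
    tableView.reloadRows(at: [indexPath], with: .automatic)
}</pre>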
<p>Alright, everything seems in place, let’s build &amp; run (command + R)!</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*5UH5Ky-JyINvJ5F4YjI6Hg.gif" /><figcaption>Nope</figcaption></figure><p>Whoops, that’s not what we want! Well, we’re missing one last detail in our <em>ViewController</em>’s <em>viewDidLoad</em> to make this a home run. We need to explicitly tell our <em>UITableView</em> that AutoLayout should take care of our cell heights. This is done in 2 lines:</p><blockquote>tableView.estimatedRowHeight = 60<br>tableView.rowHeight = UITableViewAutomaticDimension</blockquote><p>This time the code should run smoothly and behave correctly. Enjoy!</p><p>Hope you liked this tutorial! You can find its <a href="https://github.com/kirualex/ExpandingTableViewCell">source code on Github</a>. Don’t hesitate to like/comment if you found it useful!</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=f40b13206ea3" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[How Chartboost empowered me and shut down my account]]></title>
            <link>https://alexiscreuzot.medium.com/how-charboost-empowered-me-and-shut-down-my-account-8a33b2a46fe8?source=rss-57dd0a46a136------2</link>
            <guid isPermaLink="false">https://medium.com/p/8a33b2a46fe8</guid>
            <category><![CDATA[mobile]]></category>
            <category><![CDATA[mobile-advertising]]></category>
            <category><![CDATA[advertising]]></category>
            <category><![CDATA[mobile-app-development]]></category>
            <dc:creator><![CDATA[Alexis Creuzot]]></dc:creator>
            <pubDate>Tue, 20 Sep 2016 20:35:54 GMT</pubDate>
            <atom:updated>2016-09-26T22:21:41.344Z</atom:updated>
<content:encoded><![CDATA[<p><strong>EDIT</strong>: Chartboost just reached out to me and unsuspended my account. I’ll have more information in a few days and will update this article accordingly, hang tight!</p><p><strong>EDIT 2</strong>: After a call with someone from the Chartboost team, I got an apology, $300 in advertising credit and, most importantly, some context on why my account got suspended in the first place.<br>Apparently my email was flagged in their system for some reason. As their system uses machine learning for fraud detection, the variation in user behaviour combined with my flagged email triggered a false positive.<br>They understand that automating account suspensions isn’t the most developer-friendly approach and are apparently working on improving this. Let’s hope they do!</p><p>For transparency, I decided to keep the following article as is so you can form your own opinion.</p><h3>Some background</h3><p>A few months ago, I decided to entertain myself with a tiny side project. With the Pokemon GO craze at the time, the idea was an obvious (and not very original) one: a buddy app for Pokemon GO.</p><p>A few days and cups of coffee later, <a href="https://itunes.apple.com/us/app/mapvision-for-pokemon-go/id1139412364?mt=8">MapVision</a> was born: a simple map with Pokemon popping in and out.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*gPXpX3A-xrvz4iZzfu1zFA.png" /><figcaption>MapVision icon</figcaption></figure><p>Despite a (still) spotty API, the app immediately got some traction.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*E0Q99hKLwv0z8VsLKjAjWg.png" /></figure><p>This encouraging curve brought with it the opportunity to test some monetization techniques and compare how video ads fared against IAP.</p><p>I threw a few more days at the project for the killer feature: a “radar” that would notify the user of nearby Pokemon from a list of “wanted” ones.<br>Monetization-wise: one video view unlocks one more Pokemon to get notified about, and the IAP unlocks them all. Simple enough.</p><h3>Chartboost comes in</h3><p>I decided to go with Chartboost for video ads mostly out of laziness. I knew it was used in some iOS games, and their SDK seemed simple enough to implement. Plus, they had a “rewarded video” ad type, which was exactly what I was looking for: watch one, get one. Perfect.</p><p>Despite a slow start, the video ads started to pick up a little after a few days. Something around $1 per 250–300 views. Still, it seemed very low for the hassle it put the user through, but I had something working, so I decided to leave it as-is and let it run its course.</p><p>Then, 2 weeks later, out of the blue:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*CGWG2mepf8m2X0U4Di1hFQ.png" /></figure><p>I was fucking pissed. No warning, no real contact to call and, on top of that, no access to the Chartboost dashboard to review what went wrong.</p><p>Fortunately, I had put a custom event on Fabric to monitor successful rewarded video views.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*2izGUOfV0LpPQGWus5EIfA.png" /></figure><p>After reviewing the small dent in the graph, it appeared that one or two users had gone rogue and decided to view loads of videos instead of actually getting the IAP.
Well, those people exist, and isn’t that precisely why we use ads in apps?<br>Moreover, I had set a limit on watchable videos <em>from the Chartboost dashboard itself</em>, the very one I no longer had access to.</p><p>Voicing my disbelief in response did not faze them, and a few hours later I received a very formal, very polite email closing the issue, unilaterally, for good.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*oWDEZxu-0kQSCKPM3FpOTA.png" /></figure><h3>Epilogue?</h3><p>My last attempt at contacting Chartboost Support afterwards went unanswered, and I’m left with a bitter taste in my mouth.</p><p>Not because I won’t see a dime of those 14 days of ads, or the hours I put into implementing this feature.</p><p>No, I’m sick to my gut to see how a company that professes to be “<a href="https://www.chartboost.com/#features">on a mission to empower developers</a>” can care so little that it won’t engage in a real conversation or even provide any data to help remediate the issue…</p><p>One thing is for sure: being empowered never left me feeling so powerless.</p><p>I am actually curious whether anyone has had similar experiences with advertisers, web or mobile. If you did, don’t hesitate to share your experience too!</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=8a33b2a46fe8" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Development: How I Learned to Stop Worrying and Love Time-and-Materials]]></title>
            <link>https://alexiscreuzot.medium.com/d%C3%A9veloppement-comment-j-ai-appris-%C3%A0-ne-plus-m-en-faire-et-%C3%A0-aimer-la-r%C3%A9gie-bd20fec1f342?source=rss-57dd0a46a136------2</link>
            <guid isPermaLink="false">https://medium.com/p/bd20fec1f342</guid>
            <category><![CDATA[information-technology]]></category>
            <category><![CDATA[freelancing]]></category>
            <category><![CDATA[entrepreneurship]]></category>
            <dc:creator><![CDATA[Alexis Creuzot]]></dc:creator>
            <pubDate>Wed, 24 Feb 2016 13:30:55 GMT</pubDate>
            <atom:updated>2018-12-04T22:18:40.089Z</atom:updated>
<content:encoded><![CDATA[<p>Since my beginnings in the IT world, I have had the opportunity to follow and deliver many projects. The vast majority of them were fixed-price.</p><h3>Fixed-price</h3><p>The system works like this: clients express a need, generally in the form of a call for tenders, to which qualified companies respond with a quote.</p><p>To do so, they forward the specifications to their technical staff to get a time estimate. This estimate is then “challenged” by the sales team to bring it down as much as possible since, of course, several companies are likely to be competing for the contract. A clever calculation is then performed, factoring in criteria such as the daily rate, the number of people assigned to the project and various safety margins, to arrive at the final price. This process is often a source of stress for the people involved. And the work hasn’t even started yet…</p><p>Estimating the time needed to deliver a software project is far from an exact science. The precision is inversely proportional to the number of features. Put simply, the bigger the task, the greater the uncertainty.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*inmi8aLWFTx20DG-d7MyfA.jpeg" /><figcaption>Uncertainty grows with complexity</figcaption></figure><p>A real headache for companies, which then have no choice but to add a safety margin to limit the risk.</p><p>In the end, whichever vendor the client selects, the outcome is necessarily one of the following:</p><ul><li>The vendor underestimated the workload and finishes the project at a loss, in which case it will often have to rush the end of the work to cut its losses.</li><li>The estimate was spot on (a rare occurrence) and the project can therefore be considered a success.</li><li>The project is finished early, so the vendor increases its margin. The client, however, would probably have liked to spend that money on other parts of its business plan.</li></ul><p>Regardless of the outcome, keep in mind that “challenging” the technical team at the start of a project is not done without taking shortcuts. Those same shortcuts inevitably turn into <a href="https://fr.wikipedia.org/wiki/Dette_technique">technical debt</a>, at the expense of the client and of whichever vendor takes over the project later on.</p><p>The client is therefore very often the loser, ending up with a project done in a hurry, potentially botched and/or overpaid.</p><h3>Time-and-materials</h3><p>Time-and-materials simply means billing for the time spent. With this format, the vendor can still provide the client with an estimate, but it remains informal.</p><p>Working this way has several advantages.</p><ul><li>On the technical side, the vendor has no incentive to take shortcuts or botch the work.</li><li>If the project finishes earlier than expected, both the client and the vendor break even.</li><li>If it runs late, the vendor is in a much better position to explain the reasons for the delay, something that would not be possible ahead of time.</li></ul><p>With this system, the client always wins. They also have the option of stopping the project at any time if the vendor no longer meets their expectations.
<br>On the vendor’s side, there is no more “bonus margin”, but there is the guarantee of always breaking even.</p><p>It’s a win-win, with (icing on the cake) less stress all around.</p><h3>The strategy of failure</h3><p>So why are the vast majority of software projects still done fixed-price?</p><p>There are several reasons:</p><ul><li>Calls for tenders remain, for companies, a way (however illusory) <a href="http://www.commitstrip.com/fr/2017/01/09/that-little-problem-with-agile/">to project costs in advance</a>.</li><li>A lack of trust, and the fear that the vendor will drag the project out at the client’s expense.</li><li>Time-and-materials also requires more regular communication, which is not always feasible.</li></ul><p>As for me, I don’t regret having turned almost exclusively to billing for time spent since I started my business.<br>It has allowed me to approach entrepreneurship with fewer insecurities, but also, above all, to build lasting relationships of trust (and therefore of work) with my clients.</p><p>If you would like to make the transition too, feel free to share this article. The best choice is always an informed one!</p><p><em>Alexis Creuzot is a developer and co-founder of </em><a href="http://monoqle.fr"><em>Monoqle</em></a><em>, a company specializing in mobile development.</em></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=bd20fec1f342" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[In this article I want to talk about the creation of the Happy Hours coworking association, which…]]></title>
            <link>https://alexiscreuzot.medium.com/dans-cet-article-je-souhaite-parler-de-la-cr%C3%A9ation-de-l-association-de-coworking-happy-hours-auquel-83ddc6a0655?source=rss-57dd0a46a136------2</link>
            <guid isPermaLink="false">https://medium.com/p/83ddc6a0655</guid>
            <category><![CDATA[startup]]></category>
            <category><![CDATA[francai]]></category>
            <category><![CDATA[coworking]]></category>
            <dc:creator><![CDATA[Alexis Creuzot]]></dc:creator>
            <pubDate>Fri, 29 Jan 2016 14:01:35 GMT</pubDate>
            <atom:updated>2016-06-05T09:04:46.181Z</atom:updated>
<content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*e0Jbyk0J8XyA3aLO.jpg" /><figcaption>Happy Hours Coworking</figcaption></figure><p>In this article I want to talk about the creation of the <a href="http://happyh0urs.com/">Happy Hours</a> coworking association, which I took part in and which took up a good part of my time from late 2014 to early 2015.</p><p>The Happy Hours project grew out of a problem shared by <a href="http://gweno.net/">Gwénolé</a>, <a href="http://jeremypaul.me/">Jérémy</a>, <a href="http://matthieu-schneider.fr/">Matthieu</a> and myself.</p><blockquote><em>How do you work in a pleasant setting and avoid isolation when you are self-employed?</em></blockquote><h3>The existing options</h3><p>Originally, there were two options for not being alone at home.</p><h4>Renting an office from a specialized company</h4><p>This implies a prohibitive price, especially for companies just starting out, and in no way guarantees the relaxed atmosphere we were looking for.<br>Their coworking offers were, for that matter, often made “riding the wave”, by creating an open space without going much further.</p><h4>Using a coworking space “sponsored” by the city</h4><p>After trying the option offered by Rennes Métropole (la cantine numérique and l’Annexe), the verdict was mixed.</p><p>Although the atmosphere there was relaxed, the premises, the lack of independence from the city and the city’s rather pronounced goal of “boosting” the companies joining did not match our expectations.</p><h3>The concept</h3><p>After discussing it together, we landed on a concept we all agreed on. It had several priorities.</p><ul><li>Pleasant premises, easy to reach and, if possible, close to the city center.</li><li>Quality, spacious desks (we spend 8 hours a day at them, after all).</li><li>A limited headcount favoring residents over nomads, to foster cohesion between members.</li><li>Independence, allowing us to manage the space, its layout and its community life ourselves.</li><li>And, of course, free of any “accelerator”, “booster” or “startup incubator” label. Just a nice place where it is good to live and, almost incidentally, to work.</li></ul><p>The evocative name Happy Hours makes perfect sense there!</p><h3>The process</h3><p>Creating a nonprofit association is not simple, and the administrative burden can be daunting. Fortunately, with four of us, we could split the work and keep each other motivated.</p><h4>The paperwork</h4><p>The first step, registering the association, was an unappealing one, itself divided into a litany of <a href="http://vosdroits.service-public.fr/associations/F3109.xhtml">tasks described here</a>.</p><p>We did, however, get advice from several people, in particular <a href="http://www.nicolas-birckel.fr/">Nicolas Birckel</a>, who had himself created Nancy’s coworking space, <a href="http://www.poudriere.org/">la Poudrière</a>. Thanks again!</p><h4>The finances</h4><p>The second step was raising the starting capital to cover the first expenses the association would face before becoming self-sufficient.</p><p>That meant a mandatory trip to the bank, and the bank means financial forecasts and other personal solvency information.
It also means a hole in our personal finances, but you get nothing for nothing!</p><h4>The premises</h4><p>The last step, the hardest but also the most important, was finding the famous premises that would form the heart of the project.</p><p>It took us many visits, follow-up visits, disappointments and frustrations before we found the rare gem: the premises at 22 quai Duguay-Trouin, where the Happy Hours coworking space is located.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*dQm3P2jYetMOMSCj.jpg" /><figcaption><em>The magnificent façade of 22 Quai</em></figcaption></figure><p>The “not too office-like” style, the location and the space quickly won us over.</p><h3>Setting up</h3><h4>Planning</h4><p>Before we even had the keys, Jérémy got to work modeling the premises in 3D, which let us test several layouts, as he described in more detail in <a href="http://jeremypaul.me/project/happyhours.html">this article</a>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*m8ivW84EUSl_5bGD.jpg" /><figcaption>3D rendering of the premises</figcaption></figure><h4>Fitting out</h4><p>Once we had the keys in hand, we moved on to the “manual labor” part. We decided to build the desks ourselves in order to get maximum quality at a price within our means.</p><p>The other furniture was bought with the remaining budget or graciously donated by members of the association!</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/660/0*stNcD23YU9nAIwTR.png" /><figcaption><em>Our assembly specialist at work</em></figcaption></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/658/0*KBD5cIBqO9k2ra32.png" /><figcaption><em>Applying varnish after sanding. Repeat 3 times!</em></figcaption></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/590/0*OaC-Bzxr56j4jMtz.png" /><figcaption>Screwing on the leg mounts</figcaption></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*iKNHh-WoH1ZDkCTb.jpg" /><figcaption><em>After some more assembling and screwing, a very satisfying result</em></figcaption></figure><h3>The launch</h3><h4>The website</h4><p>Jérémy, Gwénolé and Matthieu pooled their know-how to build the <a href="http://happyh0urs.com/">Happy Hours website</a>. It features information about the association, the premises and the coworkers.</p><h4>Moving in</h4><p>We officially moved into the premises on January 5, 2015. The coworking space gradually filled up from that date on, and we have since closed registrations for resident spots.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*6vkGWMfWHv-qQA0j.jpg" /><figcaption><em>Desks where it is good to work</em></figcaption></figure><h3>Lessons learned</h3><p>I haven’t gone into too much detail, as some points would deserve an article of their own. I hope I have nonetheless given a good summary of the adventure (it has to be said) that creating Happy Hours turned out to be.</p><p>For my part, feeling the wood of those desks today brings me real satisfaction, the pride of having seen things through.
A sense of accomplishment I hope is shared by my partners, without whom none of this would have been possible.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=83ddc6a0655" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[iOS : Let’s Build a Network Abstraction Layer]]></title>
            <link>https://medium.com/learning-swift/ios-let-s-build-a-network-abstraction-layer-6133ae60d143?source=rss-57dd0a46a136------2</link>
            <guid isPermaLink="false">https://medium.com/p/6133ae60d143</guid>
            <category><![CDATA[development]]></category>
            <category><![CDATA[ios]]></category>
            <category><![CDATA[swift]]></category>
            <dc:creator><![CDATA[Alexis Creuzot]]></dc:creator>
            <pubDate>Wed, 23 Dec 2015 22:17:36 GMT</pubDate>
            <atom:updated>2015-12-30T16:18:12.087Z</atom:updated>
<content:encoded><![CDATA[<p>Swift implementation on top of Alamofire</p><p>A network abstraction layer is a must-have in any app that interacts with a server. In Objective-C, this abstraction layer is usually done in two steps:</p><ul><li>First, subclass the main networking component from our library, usually <em>AFHTTPSessionManager</em> when using <a href="https://github.com/AFNetworking/AFNetworking">AFNetworking</a>.</li><li>Second, create categories on our models (or services of some kind) to hold our networking code, all wrapped in convenient, explicit methods of course.</li></ul><p>It’s not too bad, but here comes Swift, and we now get a chance to look at this layer from a new angle.</p><p>Our goals are:</p><ul><li><strong>Flexibility</strong>, to be able to edit or add new endpoints efficiently</li><li><strong>Readability</strong>, to have a good idea of how our API works at a glance</li><li><strong>Code safety</strong>, with typed parameters. This allows all the compile-time goodness we expect from Xcode (completion, validation).</li><li><strong>Easy debugging</strong>, meaning being able to insert logs before and after web requests</li></ul><h3>Endpoints</h3><p>Let’s imagine we want our app to connect to a web service allowing us to both fetch and add objects for a given model <em>Color</em>. Our first task is of course to create this model in our app. Once this is taken care of, we can start thinking about how we want to interact with this API.</p><p>With Swift, <a href="https://developer.apple.com/library/ios/documentation/Swift/Conceptual/Swift_Programming_Language/Enumerations.html#//apple_ref/doc/uid/TP40014097-CH12-ID145">enumerations became more powerful than ever</a>. They are a great way to define our endpoints in a readable and type-safe way. Moreover, each <em>enum case</em> can declare associated values of any type to be stored with it.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/8d79ff2c33c4f381eee9b051da46b65a/href">https://medium.com/media/8d79ff2c33c4f381eee9b051da46b65a/href</a></iframe><p>For each of those <em>Endpoints</em>, we can provide a clear switch-based implementation for the method, path and parameters we’ll want to use.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/16729119e01fadd9d7c22104e46e6a77/href">https://medium.com/media/16729119e01fadd9d7c22104e46e6a77/href</a></iframe><p>As you can see, associated values are ideal for wrapping all the information needed for our request into a single <em>Endpoints enum case</em>.</p><h3>Request</h3><p>With a proper representation of our endpoints, we can now create our own request function on top of <a href="https://github.com/Alamofire/Alamofire">Alamofire</a>. It only needs an <em>Endpoints enum case</em> and a completion handler to rock and roll!<br>This function is also the perfect place to put any operations we might want executed before and after the request takes place. Logging, for instance (here I’m using CocoaLumberjack, don’t mind the syntax).</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/ee1f4e22820555c86b33ab200a6073e9/href">https://medium.com/media/ee1f4e22820555c86b33ab200a6073e9/href</a></iframe>
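<p>For readers whose feed doesn’t render the gists, here is a condensed sketch of the whole pattern in one place. It uses today’s Alamofire 5 syntax (which postdates this article) and plain print logging; the endpoint names and base URL are made up for the example:</p><pre>import Alamofire

// Condensed sketch of the pattern; endpoint names follow the Color example.
enum Endpoints {
    case getColors(page: Int)
    case addColor(name: String, hex: String)

    var method: HTTPMethod {
        switch self {
        case .getColors: return .get
        case .addColor:  return .post
        }
    }

    var path: String {
        switch self {
        case .getColors: return "colors"
        case .addColor:  return "colors"
        }
    }

    var parameters: Parameters {
        switch self {
        case .getColors(let page):
            return ["page": page]
        case .addColor(let name, let hex):
            return ["name": name, "hex": hex]
        }
    }
}

final class API {
    static let baseURL = "https://api.example.com/" // made-up base URL

    static func request(_ endpoint: Endpoints,
                        completion: @escaping (Result&lt;Data, AFError&gt;) -&gt; Void) {
        // Pre-request hook: an obvious place for logging
        print("[API] start \(endpoint.method.rawValue) \(endpoint.path)")
        AF.request(baseURL + endpoint.path,
                   method: endpoint.method,
                   parameters: endpoint.parameters)
            .responseData { response in
                // Post-request hook
                print("[API] done \(endpoint.path) (\(response.response?.statusCode ?? 0))")
                completion(response.result)
            }
    }
}</pre>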
<h3>Result</h3><p>After wrapping all that previous code into a publicly accessible class <em>API</em>, we can call our server like so:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/b9090e49cd1d2fb81ffdff7693579d40/href">https://medium.com/media/b9090e49cd1d2fb81ffdff7693579d40/href</a></iframe><p>Pretty sweet, eh? Readable, type-safe and with only one file to reflect any changes in our API. Mission accomplished!</p><p>A project named <a href="https://github.com/Moya/Moya">Moya</a> took the same direction, using enums to abstract the network layer. A great library, though I feel Swift makes it pretty easy and brief to build this yourself.</p><p>You can find a working example of this approach on my <a href="https://github.com/kirualex/ColourLoveSwift">Github repo ColourLoveSwift</a>.<br>Ideas, remarks? Don’t hesitate to comment or send me a message on <a href="https://twitter.com/alexiscreuzot">Twitter</a>!</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=6133ae60d143" width="1" height="1" alt=""><hr><p><a href="https://medium.com/learning-swift/ios-let-s-build-a-network-abstraction-layer-6133ae60d143">iOS : Let’s Build a Network Abstraction Layer</a> was originally published in <a href="https://medium.com/learning-swift">Learning Swift</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[How I almost killed my most successful app]]></title>
            <link>https://alexiscreuzot.medium.com/how-i-almost-killed-my-most-successful-app-9dbbd5a2144c?source=rss-57dd0a46a136------2</link>
            <guid isPermaLink="false">https://medium.com/p/9dbbd5a2144c</guid>
            <category><![CDATA[ios]]></category>
            <category><![CDATA[apple]]></category>
            <category><![CDATA[iphone]]></category>
            <dc:creator><![CDATA[Alexis Creuzot]]></dc:creator>
            <pubDate>Thu, 28 May 2015 12:00:53 GMT</pubDate>
            <atom:updated>2016-03-30T15:47:54.118Z</atom:updated>
<content:encoded><![CDATA[<p><em>Here’s the story of my fifteen days of shame and frustration that nearly made my most successful app bite the dust.</em></p><p>I developed Nice Weather back in 2013. The concept was simple and I had no expectations: a weather app focused on simplicity, with an interactive graph, packaged in a clean UI.<br>Its launch passed pretty much unnoticed, but I nevertheless kept on improving it little by little. Side projects are fun, after all.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/800/1*qSykWL6HtJDRWq91PZEplw.png" /><figcaption>Nice Weather, first of the name</figcaption></figure><p>Several months later, I decided to redesign the app and, not wanting to force it onto existing users, branched the project as a new app: <strong>Nice Weather 2</strong>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/800/1*-UkIkd-epPF2jQ80sCEdGA.jpeg" /><figcaption>Nice Weather 2</figcaption></figure><p>I got more traction. Then, out of the blue, an angel smiled upon me. Apple had noticed the app and decided to feature it on the App Store.<br>Downloads skyrocketed, and it quickly became my number 1 selling app.</p><p>The next few weeks were pure craziness.</p><p>After a while, things gently settled down; Nice Weather 2 had made its place in the Weather section of the App Store, ensuring stable download rates. I kept on improving it and getting feedback from overall delighted users.</p><p>After one year of mostly small updates, I decided last April to make a huge upgrade of the app. It involved trashing and recoding some parts from the ground up.</p><p>One of the biggest things I wanted to improve was storing data more efficiently and reliably. If you have ever worked with databases, you know how painful and risky it is to make their structure evolve. To make sure you stay compatible with previous data models, you need to create what we call “migrations”, which essentially take the data from the old structure and process it so it “fits” in the new one.</p><p>The main issue with migrations is that you need to think of every form the data may have taken and process it accordingly.</p>
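<p>In code, the usual shape of this is a chain of version-stepped migrations, each one upgrading the data by exactly one schema version. A generic Swift sketch (not Nice Weather’s actual code):</p><pre>import Foundation

// Each closure upgrades the stored data by exactly one schema version.
let migrations: [() -&gt; Void] = [
    { /* v0 -&gt; v1: e.g. move plist data into the new database */ },
    { /* v1 -&gt; v2: e.g. rename a field and backfill defaults  */ },
    { /* v2 -&gt; v3: e.g. split one record type in two          */ },
]

func migrateIfNeeded(defaults: UserDefaults = .standard) {
    var version = defaults.integer(forKey: "schemaVersion") // 0 on first run
    while version &lt; migrations.count {
        migrations[version]() // run a single upgrade step
        version += 1
        // Persist after every step, so a crash mid-upgrade doesn't leave
        // the data in an unrecorded, in-between state
        defaults.set(version, forKey: "schemaVersion")
    }
}</pre>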
<p>The big update rolled out, and I saw almost immediately the crash-free sessions drop from the usual 99.5% to 94%. Something was wrong, and I jumped to my computer to solve it.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/866/1*B3CvwuK1nmEPNu-ZK1Yqjg.png" /><figcaption>Impact of May 7 update</figcaption></figure><p>I took the opportunity to improve some of the data management and submitted a new update with an expedited review request, which was accepted swiftly.</p><p>The new-new update was then released, and it did not take long for me to realize that something was askew: dozens of e-mails, either from users reporting crashes or from Fabric, the tool I use to track them. The drop was harsh.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*03ZcPr2SxGBPiKeNPARqzA.png" /><figcaption>Catastrophic drop after May 11 update</figcaption></figure><p>From 95% crash-free sessions I dropped to 40%. In my hurry I had forgotten a migration while submitting the fix, and essentially every existing user of the app was crashing on startup after this new update.</p><p>I jumped back to my screen and made sure that the migration was taken care of this time, testing heavily on previous versions to ensure that there would be no more mishaps. Submission. Expedited review. Release.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*fY2smdIT0oif0jaUyiHX9g.png" /><figcaption>Improvement after May 18 update</figcaption></figure><p>Fuck.</p><p>Crash-free sessions stopped dropping, but past a point they stabilized around 60%. What in the hell had I done wrong?</p><p>It took me a week to figure it out. I had forgotten the unlucky users who had crashed on the previous version, whose data was stuck “in-between”, in some kind of limbo state.</p><p>So, here we go again. Submission, but no expedited review this time: after two in such a short timeframe, there was no point in even trying.<br>The wait was excruciating: while the update lingered in review, I witnessed first-hand users tearing the app apart in their reviews and, consequently, all the indicators slowly going red.</p><p>It took 10 excruciatingly long days for the app to be reviewed and, at last, released.</p><p>As of now, Nice Weather 2 has returned to a more manageable crash rate, hovering nicely around 1%. If anything, this adventure taught me two things:</p><ul><li>Never submit an update in a hurry; whatever the issue, give it at least a good night’s sleep</li><li>Use a good crash reporting tool. Needless to say, this story would have ended way worse without Fabric allowing me to peek behind the curtain</li></ul><p>Let it be a valuable lesson for all iOS developers out there!</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=9dbbd5a2144c" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[7 Awesome Pods For Your iOS Projects]]></title>
            <link>https://alexiscreuzot.medium.com/7-awesome-pods-for-your-ios-projects-537d82973efc?source=rss-57dd0a46a136------2</link>
            <guid isPermaLink="false">https://medium.com/p/537d82973efc</guid>
            <dc:creator><![CDATA[Alexis Creuzot]]></dc:creator>
            <pubDate>Thu, 30 Apr 2015 13:21:45 GMT</pubDate>
            <atom:updated>2015-05-27T22:11:21.933Z</atom:updated>
<content:encoded><![CDATA[<p><em>In case you’ve never heard of it, CocoaPods is a dependency manager for Swift and Objective-C Cocoa projects. It has thousands of libraries that can be added to your project in the blink of an eye.<br>In this article, I present some of the best libraries CocoaPods has to offer, ones I frankly can’t live without.</em></p><h3><a href="https://github.com/supermarin/ObjectiveSugar">pod ‘ObjectiveSugar’</a></h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/572/1*vwCE-wBrTePeHwmAxYelGg.png" /></figure><p>ObjectiveSugar brings a set of functional additions for Foundation you wish you’d had in the first place.</p><p>Most of those additions come in the form of categories that extend existing classes of the Foundation framework. In particular, you’ll find some of the familiar functions that make your life so much easier and come out of the box in Ruby.</p><p>This pod will definitely help you write more readable and concise code, especially when your algorithms tend to get a little “spaghetti”.</p><h3><a href="https://github.com/CocoaLumberjack/CocoaLumberjack">pod ‘CocoaLumberjack’</a></h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/250/1*HHtCx60uFHepb9fapwRSJg.png" /></figure><p>If you rely a lot on logging to follow your app’s behavior, CocoaLumberjack will quickly become your best friend. You can define log levels, send logs over the network, or even save them locally. And it’s faster than NSLog.</p><p>You can also display logs in different colors by using the <a href="https://github.com/CocoaLumberjack/CocoaLumberjack/blob/master/Documentation/XcodeColors.md">XcodeColors plugin for Xcode</a>. A must-have!</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/908/1*oepjUMETpwBGgSIvAO9--A.png" /><figcaption>CocoaLumberjack with XcodeColors</figcaption></figure><h3><a href="https://github.com/AFNetworking/AFNetworking">pod ‘AFNetworking’</a></h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/605/1*3uMMubRXuopW6-dEcO0ttA.png" /></figure><p>It’s quite likely you have already heard of or even used this library, but I couldn’t make this list without including it.</p><p>AFNetworking is simply the reference for managing network requests to an API. It is simple, fast, and supported by a huge community of developers.</p><h3><a href="https://github.com/realm/realm-cocoa">pod ‘Realm’</a></h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/196/1*xcS2e6HYC8906MUabESzNg.png" /></figure><p>In my 3 years of development I’ve searched extensively for a satisfying database framework. I tried FMDB, CoreData, MagicalRecord, and used iActiveRecord for a while.<br>Each time I got frustrated. Some are too low-level and need a lot of boilerplate code, some don’t play well with threads, and some lack performance as soon as your queries get a little complicated.</p><p>For now, Realm seems to have achieved the perfect mix. It allows ultra-clean and readable code (think Active Record pattern) and manages to be very, very fast. I cannot recommend this library enough if you need a client-side database.</p>
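<p>To give an idea of how little ceremony it requires, here is a minimal sketch using Realm’s Objective-C API. The City model and the query are my own illustration, not something from a real project:</p><pre>#import &lt;Realm/Realm.h&gt;

// Declaring a model is just subclassing RLMObject with plain properties.
// (City is an illustrative example, not a model from one of my apps.)
@interface City : RLMObject
@property NSString *name;
@property double temperature;
@end

@implementation City
@end

// Writing and querying take a few readable lines: no SQL, no boilerplate.
void SaveAndQueryExample(void) {
    City *paris = [[City alloc] init];
    paris.name = @"Paris";
    paris.temperature = 21.5;

    RLMRealm *realm = [RLMRealm defaultRealm];
    [realm transactionWithBlock:^{
        [realm addObject:paris];
    }];

    RLMResults *warmCities = [City objectsWhere:@"temperature &gt; 20"];
    NSLog(@"%lu warm cities", (unsigned long)warmCities.count);
}</pre>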
<h3><a href="https://github.com/BradLarson/GPUImage">pod ‘GPUImage’</a></h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/238/1*cMRYMn2XcGywoF6sg1Rcag.png" /></figure><p>If you need to process images or video in your app and don’t want to spend weeks poring over research papers and other complex documentation, this library is made for you!</p><p>With an impressive collection of filters and image processing algorithms, you can achieve blurs, blend images, detect edges, and much more in just a few lines of code.</p><p>It is also very well thought out and allows easy chaining of different filters and transformations while staying performant.</p>
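<p>As a quick illustration of that chaining, here is a sketch using GPUImage’s Objective-C API, wiring a blur into a contrast boost. The filter choice and values are just an example:</p><pre>#import "GPUImage.h"

// Source image -&gt; gaussian blur -&gt; contrast boost, chained via addTarget:.
UIImage *ProcessedImage(UIImage *inputImage) {
    GPUImagePicture *source = [[GPUImagePicture alloc] initWithImage:inputImage];
    GPUImageGaussianBlurFilter *blur = [[GPUImageGaussianBlurFilter alloc] init];
    GPUImageContrastFilter *contrast = [[GPUImageContrastFilter alloc] init];
    contrast.contrast = 1.5;

    [source addTarget:blur];
    [blur addTarget:contrast];

    // Render the chain once and capture the result as a UIImage.
    [contrast useNextFrameForImageCapture];
    [source processImage];
    return [contrast imageFromCurrentFramebuffer];
}</pre>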
<h3><a href="https://github.com/TransitApp/SVProgressHUD">pod ‘SVProgressHUD’</a></h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/331/1*E77lbBtrgCwtHMibKC9RDA.png" /></figure><p>You are probably familiar with the image above, as you can now find this library in many iOS apps.</p><p>In general, you don’t want your user to wait, and you should preload as much as you can for a smooth experience.<br>But when you can’t, SVProgressHUD does a fine job of indicating loading, in one line of code. You can also use it to display “toasts” to the user and even customize the image and text.</p><p>Too bad it doesn’t provide a determinate progress indicator, in which case you may want to use the equally good <a href="https://github.com/jdg/MBProgressHUD">MBProgressHUD</a> library.</p><h3><a href="https://github.com/PrideChung/FontAwesomeKit">pod ‘FontAwesomeKit’</a></h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/737/1*_tjkS6qSXRe1C2OC3sV2CA.png" /></figure><p>Already well known in the web world, the FontAwesome system is also available on iOS, and it rocks!</p><p>If you are getting fed up with having to create @2x and @3x versions of each and every icon in your app, FontAwesomeKit will be a lifesaver. It basically allows you to use icons like you would use letters from a font. That means every icon is vectorized and can take any color you want!</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/676/1*lFOSq9uh6b6IbK6qT8A-OA.png" /><figcaption>Sample icons from IonIcons</figcaption></figure><p>You can already use a variety of icons from multiple iconic fonts like IonIcons or Zocial, but you can also create your own font and use it by extending the <strong>FAKIcon</strong> class.</p><p><em>That’s it! I hope you liked this article and that some of these libraries will find their way into your Podfile ☺</em></p><p><strong>And you, what are your favorite pods?</strong></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=537d82973efc" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Cocoapods with Swift]]></title>
            <link>https://alexiscreuzot.medium.com/cocoapods-with-swift-e6f8ba8f0afc?source=rss-57dd0a46a136------2</link>
            <guid isPermaLink="false">https://medium.com/p/e6f8ba8f0afc</guid>
            <dc:creator><![CDATA[Alexis Creuzot]]></dc:creator>
            <pubDate>Wed, 04 Jun 2014 08:39:13 GMT</pubDate>
            <atom:updated>2016-03-04T15:25:31.206Z</atom:updated>
<content:encoded><![CDATA[<h4>Support Objective-C pods in a Swift project</h4><p><em>UPDATE: This article is now deprecated; you can import any pod very easily in Swift by just adding </em><strong>use_frameworks!</strong> <em>to your Podfile!</em></p><p>First, create your Podfile and add the pods you need as usual.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/600/1*ZJgYyIQFmFcAmH1SRwOxZg.png" /><figcaption>Podfile</figcaption></figure><p>Install them using the <em>pod install</em> command and open the .xcworkspace file created in your project folder. The pods should now be included in your workspace.</p><p>Now for the interesting part. In order to use those pods, you are going to create a <strong>bridging header file</strong>. Click on File -&gt; New -&gt; File… and select “Header File” in the “Source” tab.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/708/1*_2YVqX3WjNF-iGnVr7uE5w.png" /></figure><p>Name this file <strong>project_name-Bridging-Header.h</strong> (here, <strong>SwiftBasics-Bridging-Header.h</strong>).</p><p>Open your project’s Build Settings and search for “Bridging”. Set the key “Objective-C Bridging Header” to <strong>project_name/project_name-Bridging-Header.h</strong>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/675/1*lLIjbREKBw_ZI6FBeyrAvA.png" /><figcaption>Here my project is named “SwiftBasics”</figcaption></figure><p>You are now ready to add imports to your Bridging-Header.h file for the pods you want to use, just as you would in your .pch file.</p>
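<p>For reference, here is roughly what such a file can look like. The exact imports depend on the pods you installed; SVProgressHUD and AFNetworking are just examples:</p><pre>// SwiftBasics-Bridging-Header.h
// Every Objective-C header imported here becomes visible to all Swift files,
// with no per-file import needed on the Swift side.
#import "SVProgressHUD.h"
#import "AFNetworking.h"</pre>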
<figure><img alt="" src="https://cdn-images-1.medium.com/max/600/1*6SQR9EaYr2bL0FiunLZulg.png" /><figcaption>Content of SwiftBasics-Bridging-Header.h</figcaption></figure><p>That’s it, you can now use your pods*. You don’t even have to import them again in your .swift files. You can also import any other Objective-C file added to your workspace this way. Sweet!</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/600/1*GydI5K0La0bzlOp9AfXVhg.png" /><figcaption>Using SVProgressHUD to show a message</figcaption></figure><p>I hope you liked this article; I’ll post more tips as I dive deeper into Swift. You can also find some of my favorite pods in this article:</p><p><a href="https://medium.com/@kirualex/7-awesome-pods-for-your-ios-projects-537d82973efc">7 Awesome Pods For Your iOS Projects</a></p><p>*<em>Warning: some specific pods may not work or may need additional configuration.</em></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=e6f8ba8f0afc" width="1" height="1" alt="">]]></content:encoded>
        </item>
    </channel>
</rss>