+
+Behind every great scientist are great repeatable methods. Sadly, this is easier said than done.
+
+When talented scientists, engineers, or developers work on their own, a mess may be unavoidable, yet it
+remains manageable. As time passes and more people join your project, however,
+managing the clutter takes its toll on productivity.
+As your project moves toward production,
+visibility and provenance become a must for scaling your deep-learning efforts, and both
+suffer as your team grows.
+
+For teams and entire companies, TRAINS logs everything to one central server and takes on the responsibility for visibility and provenance
+so productivity does not suffer.
+TRAINS records and manages a wide variety of deep learning research workloads, and it does so with unbelievably small integration costs.
+
+TRAINS is an auto-magical experiment manager that you can use productively with minimal integration while
+preserving your existing methods and practices. Use it on a daily basis to boost collaboration and visibility,
+or use it to automatically collect your experimentation logs, outputs, and data to one centralized server for provenance.
+
+
+
+## Why Should I Use TRAINS?
+
+TRAINS is our solution to a problem we share with countless other researchers and developers in the
+machine learning/deep learning universe:
+training production-grade deep learning models is a glorious but messy process.
+TRAINS tracks and controls that process by associating code version control, research projects, performance metrics, and model provenance.
+TRAINS removes the mess but leaves the glory.
+
+
+Choose TRAINS because...
+
+* Sharing experiments with the team is difficult and gets even more difficult further up the chain.
+* Like all of us, you lost a model and are left with no repeatable process.
+* You set up a central location for TensorBoard and it exploded with a gazillion experiments.
+* You accidentally threw away important results while trying to manually clean up the clutter.
+* You have no way to associate the training code commit with the model or the TensorBoard logs.
+* You are storing model parameters in the checkpoint filename.
+* You cannot find any other tool for comparing results, hyper-parameters, and code commits.
+* TRAINS requires **only two-lines of code** for full integration.
+* TRAINS is **free**.
+
+## Main Features
+
+* Seamless integration with leading frameworks, including PyTorch, TensorFlow, and Keras, with others coming soon!
+* Track everything with two lines of code.
+* Model logging that automatically associates models with code and the parameters used to train them, including initial weights logging.
+* Multi-user process tracking and collaboration.
+* Management capabilities including project management, filter-by-metric, and detailed experiment comparison.
+* Centralized server for aggregating logs, records, and general bookkeeping.
+* Automatically create a copy of models on centralized storage (TRAINS supports shared folders, S3, and GS; Azure support is coming soon!).
+* Support for Jupyter notebook (see the [trains-jupyter-plugin]()) and PyCharm remote debugging (see the [trains-pycharm-plugin]()).
+* A field-tested, feature-rich SDK for your on-the-fly customization needs.
+
+
+## TRAINS Magically Logs
+
+TRAINS magically logs the following (see the sketch after this list):
+
+* Git repository, branch, and commit ID
+* Hyper-parameters, including:
+    * ArgParser for command-line parameters with currently used values
+    * TensorFlow Defines (absl-py)
+    * Manually passed parameter dictionaries
+* Initial model weights file
+* Model snapshots
+* stdout and stderr
+* TensorBoard scalars, metrics, histograms, and images (tensorboardX is also supported); audio support is coming soon
+* Matplotlib
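+
+As a concrete illustration, here is a minimal, hedged sketch of a script whose run would be captured end-to-end; the project/task names and the `--lr` flag are placeholders of our own, not part of TRAINS:
+
+```python
+import argparse
+
+import matplotlib.pyplot as plt
+
+from trains import Task
+
+# The single init call below enables all of the automatic logging listed above.
+task = Task.init(project_name='examples', task_name='auto magic')
+
+parser = argparse.ArgumentParser()
+parser.add_argument('--lr', type=float, default=0.01)
+args = parser.parse_args()            # command-line values are logged as hyper-parameters
+
+print('training with lr =', args.lr)  # stdout is captured
+
+plt.plot([0, 1, 2], [0.9, 0.5, 0.3])  # the Matplotlib figure is captured
+plt.show()
+```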
+
+## See for Yourself
+
+We have a demo server up and running at [https://demoapp.trainsai.io](https://demoapp.trainsai.io) (it resets every 24 hours and all of the data is deleted).
+You can test your code with it:
+
+1. Install TRAINS
+
+ pip install trains
+
+1. Add the following to your code:
+
+ from trains import Task
+    task = Task.init(project_name="my_project", task_name="my_task")
+
+1. Run your code. When TRAINS connects to the server, a link to the experiment page is printed.
+
+1. In the Web-App, view your parameters, model, and TensorBoard metrics.
+
+ ![GIF screen-shot here. If the Gif looks bad, a few png screen grabs:
+ Home Page
+ Projects Page
+ Experiment Page with experiment open tab execution
+ Experiment Page with experiment open tab model
+ Experiment Page with experiment open tab results
+ Results Page
+ Comparison Page
+ Parameters
+ Graphs
+ Images
+ Experiment Models Page]
+
+## How TRAINS Works
+
+TRAINS is composed of the following:
+
+* the [trains-server]()
+* the [Web-App]() (web user interface)
+* the Python SDK (auto-magically connects your code, see [Using TRAINS (Example)](#using-trains-example)).
+
+The following diagram illustrates the interaction of the TRAINS-server and a GPU machine:
+
+
+
+## Installing and Configuring TRAINS
+
+1. Install the trains-server docker (see [Installing the TRAINS Server](../trains_server)).
+
+1. Install the TRAINS package:
+
+ pip install trains
+
+1. Run the initial configuration wizard to set up the trains-server (ip:port and user credentials):
+
+ trains-init
+
+After installing and configuring, your configuration file is saved at `~/trains.conf`. View a sample configuration file [here]([link to git]).
+
+## Using TRAINS (Example)
+
+Add these two lines of code to your script:
+
+ from trains import Task
+ task = Task.init(project_name, task_name)
+
+* If no project name is provided, the repository name is used.
+* If no task (experiment) name is provided, the main filename is used as the experiment name.
+
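+For instance, a minimal sketch that relies on both defaults (assuming your script lives in a git repository):
+
+```python
+from trains import Task
+
+# No names provided: the repository name and the script filename are used.
+task = Task.init()
+```
+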
+Executing your script prints a direct link to the currently running experiment page, for example:
+
+```bash
+TRAINS Metrics page:
+
+https://demoapp.trainsai.io/projects/76e5e2d45e914f52880621fe64601e85/experiments/241f06ae0f5c4b27b8ce8b64890ce152/output/log
+```
+
+*[Add GIF screenshots here]*
+
+For more examples and use cases, see [examples](link docs/examples/).
+
+## Who Supports TRAINS?
+
+The people behind *allegro.ai*.
+We build deep learning pipelines and infrastructure for enterprise companies.
+We built TRAINS to track and control the glorious
+but messy process of training production-grade deep learning models.
+We are committed to vigorously supporting and expanding the capabilities of TRAINS,
+because it is not only our beloved creation; we also use it daily.
+
+## Why Are We Releasing TRAINS?
+
+We believe TRAINS is ground-breaking. We wish to establish new standards of experiment management in
+machine and deep learning.
+Only the greater community can help us do that.
+
+We promise to always be backward compatible. If you start working with TRAINS today, even though this code is still in the beta stage, your logs and data will always upgrade with you.
+
+## License
+
+Apache License, Version 2.0 (see the [LICENSE](https://www.apache.org/licenses/LICENSE-2.0.html) for more information)
+
+## Guidelines for Contributing
+
+See the TRAINS [Guidelines for Contributing](contributing.md).
+
+## FAQ
+
+See the TRAINS [FAQ](faq.md).
+
+
+May the force (and the goddess of learning rates) be with you!
+
diff --git a/docs/contributing.md b/docs/contributing.md
new file mode 100644
index 00000000..7f17422c
--- /dev/null
+++ b/docs/contributing.md
@@ -0,0 +1,54 @@
+# Guidelines for Contributing
+
+Firstly, we thank you for taking the time to contribute!
+
+The following is a set of guidelines for contributing to TRAINS.
+These are primarily guidelines, not rules.
+Use your best judgment and feel free to propose changes to this document in a pull request.
+
+## Reporting Bugs
+
+This section guides you through submitting a bug report for TRAINS.
+By following these guidelines, you
+help maintainers and the community understand your report, reproduce the behavior, and find related reports.
+
+Before creating bug reports, please check whether the bug you want to report already appears [here](link to issues).
+You may discover that you do not need to create a bug report.
+When you are creating a bug report, please include as much detail as possible.
+
+**Note**: If you find a **Closed** issue that seems to be the same issue you are currently experiencing,
+then open a **New** issue and include a link to the original (closed) issue in the body of the new one.
+
+Explain the problem and include additional details to help maintainers reproduce the problem:
+
+* **Use a clear and descriptive title** for the issue to identify the problem.
+* **Describe the exact steps necessary to reproduce the problem** in as much detail as possible. Please do not just summarize what you did. Make sure to explain how you did it.
+* **Provide the specific environment setup.** Include the `pip freeze` output, specific environment variables, Python version, and other relevant information.
+* **Provide specific examples to demonstrate the steps.** Include links to files or GitHub projects, or copy/paste snippets which you use in those examples.
+* **If you are reporting any TRAINS crash,** include a crash report with a stack trace from the operating system. Make sure to add the crash report in the issue and place it in a [code block](https://help.github.com/en/articles/getting-started-with-writing-and-formatting-on-github#multiple-lines),
+a [file attachment](https://help.github.com/articles/file-attachments-on-issues-and-pull-requests/), or just put it in a [gist](https://gist.github.com/) (and provide a link to that gist).
+* **Describe the behavior you observed after following the steps** and the exact problem with that behavior.
+* **Explain which behavior you expected to see and why.**
+* **For Web-App issues, please include screenshots and animated GIFs** which recreate the described steps and clearly demonstrate the problem. You can use [LICEcap](https://www.cockos.com/licecap/) to record GIFs on macOS and Windows, and [silentcast](https://github.com/colinkeenan/silentcast) or [byzanz](https://github.com/threedaymonk/byzanz) on Linux.
+
+## Suggesting Enhancements
+
+This section guides you through submitting an enhancement suggestion for TRAINS, including
+completely new features and minor improvements to existing functionality.
+By following these guidelines, you help maintainers and the community understand your suggestion and find related suggestions.
+
+Enhancement suggestions are tracked as GitHub issues. After you determine which repository your enhancement suggestion is related to, create an issue on that repository and provide the following:
+
+* **A clear and descriptive title** for the issue to identify the suggestion.
+* **A step-by-step description of the suggested enhancement** in as much detail as possible.
+* **Specific examples to demonstrate the steps.** Include copy/pasteable snippets which you use in those examples as [Markdown code blocks](https://help.github.com/articles/markdown-basics/#multiple-lines).
+* **Describe the current behavior and explain which behavior you expected to see instead and why.**
+* **Include screenshots or animated GIFs** which help you demonstrate the steps or point out the part of TRAINS which the suggestion is related to. You can use [LICEcap](https://www.cockos.com/licecap/) to record GIFs on macOS and Windows, and [silentcast](https://github.com/colinkeenan/silentcast) or [byzanz](https://github.com/threedaymonk/byzanz) on Linux.
+
+
+
+
+
+
+
+
diff --git a/docs/faq.md b/docs/faq.md
new file mode 100644
index 00000000..5eb8c0fa
--- /dev/null
+++ b/docs/faq.md
@@ -0,0 +1,160 @@
+# FAQ
+
+**Can I store more information on the models? For example, can I store enumeration of classes?**
+
+YES!
+
+Use the SDK `set_model_label_enumeration` method:
+
+```python
+Task.current_task().set_model_label_enumeration({'label': 0})
+```
+
+**Can I store the model configuration file as well?**
+
+YES!
+
+Use the SDK `set_model_design` method:
+
+```python
+Task.current_task().set_model_design('a very long text of the configuration file content')
+```
+
+**I want to add more graphs, not just with Tensorboard. Is this supported?**
+
+YES!
+
+Use an SDK [Logger](link to git) object. An instance can always be retrieved with `Task.current_task().get_logger()`:
+
+```python
+logger = Task.current_task().get_logger()
+logger.report_scalar("loss", "classification", iteration=42, value=1.337)
+```
+
+TRAINS supports scalars, plots, 2d/3d scatter diagrams, histograms, surface diagrams, confusion matrices, images, and text logging.
+
+An example can be found [here](docs/manual_log.py).
+
+**I noticed that all of my experiments appear as “Training”. Are there other options?**
+
+YES!
+
+When creating experiments and calling `Task.init`, you can pass an experiment type.
+The currently supported types are `Task.TaskTypes.training` and `Task.TaskTypes.testing`:
+
+```python
+task = Task.init(project_name, task_name, Task.TaskTypes.testing)
+```
+
+If you feel we should add a few more, let us know in the [issues]() section.
+
+**I noticed I keep getting a message “warning: uncommitted code”. What does it mean?**
+
+TRAINS not only detects your current repository and git commit,
+but it also warns you if you are using uncommitted code. TRAINS does this
+because uncommitted code makes the experiment difficult to reproduce.
+
+**Is there anything you can do about running uncommitted code?**
+
+YES!
+
+TRAINS stores the git diff together with the experiment.
+The Web-App will present the git diff as well; this is coming very soon!
+
+**I read that there is a feature for centralized model storage. How do I use it?**
+
+Pass the `output_uri` parameter to `Task.init`, for example:
+
+```python
+Task.init(project_name, task_name, output_uri='/mnt/shared/folder')
+```
+
+All of the stored snapshots are copied into a subfolder whose name contains the task ID, for example:
+
+`/mnt/shared/folder/task_6ea4f0b56d994320a713aeaf13a86d9d/models/`
+
+Other options include:
+
+```python
+Task.init(project_name, task_name, output_uri='s3://bucket/folder')
+```
+
+```python
+Task.init(project_name, task_name, output_uri='gs://bucket/folder')
+```
+
+These require configuring the cloud storage credentials in `~/trains.conf` (see an [example]()).
+
+**I am training multiple models at the same time, but I only see one of them. What happened?**
+
+This will be fixed in a future version. TRAINS does support multiple models
+from the same task/experiment, so you can find all of them in the project's Models tab.
+In the Task view, however, only the last model is presented.
+
+**Can I log input and output models manually?**
+
+YES!
+
+See [InputModel]() and [OutputModel]().
+
+For example:
+
+```python
+# Imports assume trains' top-level exports.
+from trains import Task, InputModel, OutputModel
+
+input_model = InputModel.import_model(link_to_initial_model_file)
+Task.current_task().connect(input_model)
+OutputModel(Task.current_task()).update_weights(link_to_new_model_file_here)
+```
+
+**I am using Jupyter Notebook. Is this supported?**
+
+YES!
+
+Jupyter Notebook is supported.
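+
+The integration is the same two lines as in a plain script; a minimal sketch of a first notebook cell (the names are placeholders):
+
+```python
+from trains import Task
+
+task = Task.init(project_name='examples', task_name='my notebook')
+```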
+
+**I do not use ArgParser for hyper-parameters. Do you have a solution?**
+
+YES!
+
+TRAINS supports using a Python dictionary for hyper-parameter logging.
+
+```python
+parameters_dict = Task.current_task().connect(parameters_dict)
+```
+
+From this point onward, not only are the dictionary's key/value pairs stored, but any later change to the dictionary is automatically stored as well.
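+
+A slightly fuller hedged sketch (the parameter names and values are illustrative):
+
+```python
+from trains import Task
+
+task = Task.init(project_name='examples', task_name='dict hyper-parameters')
+
+parameters = {'batch_size': 64, 'learning_rate': 0.001}
+parameters = task.connect(parameters)  # the key/value pairs are logged
+
+parameters['learning_rate'] = 0.0001   # this change is stored automatically
+```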
+
+**Git is not well supported in Jupyter. We just gave up on properly committing our code. Do you have a solution?**
+
+YES!
+
+Check our [trains-jupyter-plugin](). It is a Jupyter plugin that allows you to commit your notebook directly from Jupyter. It also saves the Python version of the code and creates an updated `requirements.txt` so you know which packages you were using.
+
+**Can I use TRAINS with scikit-learn?**
+
+YES!
+
+scikit-learn is supported. Everything you do is logged, with one caveat: models are not logged automatically,
+because in most cases scikit-learn simply pickles the model object to a file, so there is no underlying framework for TRAINS to hook into.
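+
+For example, a hedged sketch that registers the pickled model manually via `OutputModel` (joblib and the toy data are our own choices, not TRAINS requirements):
+
+```python
+import joblib
+from sklearn.linear_model import LogisticRegression
+
+from trains import Task, OutputModel
+
+task = Task.init(project_name='examples', task_name='sklearn example')
+
+model = LogisticRegression()
+model.fit([[0.0], [1.0], [2.0], [3.0]], [0, 0, 1, 1])
+
+# scikit-learn just pickles the estimator to a file...
+joblib.dump(model, 'model.pkl')
+# ...so we register that file with TRAINS ourselves.
+OutputModel(task).update_weights('model.pkl')
+```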
+
+**I am working with PyCharm and remotely debugging a machine, but the git repo is not detected. Do you have a solution?**
+
+YES!
+
+This is such a common occurrence that we created a PyCharm plugin that allows the remote debugger to grab your local repository and commit ID. See our [trains-pycharm-plugin]() repository for instructions and the [latest release]().
+
+**How do I know a new version came out?**
+
+Unfortunately, TRAINS currently does not support auto-update checks. We hope to add this soon.
+
+**Sometimes I see experiments marked as running when they are not. Why is that?**
+
+When the Python process exits in an orderly fashion, TRAINS closes the experiment.
+If a process crashes, then sometimes the stop signal is missed. You can safely right-click the experiment in the Web-App and stop it.
+
+**In the experiment log tab, I’m missing the first log lines. Where are they?**
+
+Unfortunately, due to speed/optimization issues, we opted to display only the last several hundred lines. The full log can be downloaded from the Web-App.
+
+
+
+
diff --git a/examples/absl_example.py b/examples/absl_example.py
new file mode 100644
index 00000000..e2269666
--- /dev/null
+++ b/examples/absl_example.py
@@ -0,0 +1,43 @@
+# TRAINS - example code, absl logging
+#
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+
+import sys
+
+from absl import app
+from absl import flags
+from absl import logging
+
+from trains import Task
+
+
+FLAGS = flags.FLAGS
+
+flags.DEFINE_string('echo', None, 'Text to echo.')
+flags.DEFINE_string('another_str', 'My string', 'A string', module_name='test')
+
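+# Task.init auto-logs the absl flags; this example defines flags both before
+# and after the call to show that both sets are captured.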
+task = Task.init(project_name='examples', task_name='absl example')
+
+flags.DEFINE_integer('echo3', 3, 'Text to echo.')
+flags.DEFINE_string('echo5', '5', 'Text to echo.', module_name='test')
+
+
+parameters = {
+ 'list': [1, 2, 3],
+ 'dict': {'a': 1, 'b': 2},
+ 'int': 3,
+ 'float': 2.2,
+ 'string': 'my string',
+}
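+# connect() logs the dictionary as hyper-parameters and tracks later changes.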
+parameters = task.connect(parameters)
+
+
+def main(_):
+ print('Running under Python {0[0]}.{0[1]}.{0[2]}'.format(sys.version_info), file=sys.stderr)
+ logging.info('echo is %s.', FLAGS.echo)
+
+
+if __name__ == '__main__':
+ app.run(main)
diff --git a/examples/jupyter.ipynb b/examples/jupyter.ipynb
new file mode 100644
index 00000000..99325b89
--- /dev/null
+++ b/examples/jupyter.ipynb
@@ -0,0 +1,160 @@
+{
+ "cells": [
+ {
+ "cell_type": "code",
+ "execution_count": 1,
+ "metadata": {
+ "pycharm": {
+ "is_executing": false
+ }
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "TRAINS Task: created new task id=e8fc2b809a384c3f8ec3ded54a2aae44\n",
+ "TRAINS results page: http://ec2-3-218-72-191.compute-1.amazonaws.com:8080/projects/ec4476fb59c64d89af880ff0445c836b/experiments/e8fc2b809a384c3f8ec3ded54a2aae44/output/log\n"
+ ]
+ }
+ ],
+ "source": [
+ "from trains import Task\n",
+    "task = Task.init(project_name='examples', task_name='Jupyter example')"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 2,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import numpy as np\n",
+ "import matplotlib.pyplot as plt\n",
+ "import matplotlib\n",
+ "%matplotlib inline"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 3,
+ "metadata": {
+ "pycharm": {
+ "name": "#%%\n"
+ }
+ },
+ "outputs": [
+ {
+ "data": {
+ "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXcAAAD8CAYAAACMwORRAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzsvXmQHOd5p/l8edZ9dvWNRuMGAfDAQUIED/GSREqUaNmSLHk0Pteyd2zPTuzsbMxEbGxMTMxsTOyud7zhcNgrT8iybOu+LJG6KPEmARAAcZ/E0UAf6Ku6667Kqsr89o8CwAbQ6LOqL+QTwQiiOyvz7e6qX375fu/7e4WUEhcXFxeXlYWy2AG4uLi4uNQfV9xdXFxcViCuuLu4uLisQFxxd3FxcVmBuOLu4uLisgJxxd3FxcVlBeKKu4uLi8sKxBV3FxcXlxWIK+4uLi4uKxBtsS7c1NQku7u7F+vyLi4uLsuSQ4cOjUopE9Mdt2ji3t3dzcGDBxfr8i4uLi7LEiHE5Zkc56ZlXFxcXFYgrri7uLi4rEBccXdxcXFZgbji7uLi4rICccXdxcXFZQWyaNUyLi71plAtkCyPM2olGbJGqdgVBODVvLR6WogZEeJmDEPRFztUF5eG44q7y7LGkQ5XS0OcSJ3hYv4yQoAEPMJAEbUH06o1wrncBRQUVKGyLbyZzcENRIzw4gbv4tJAXHF3WbakKxleH36HvuIApmIQNyI3BP1OVJ0qR1MnOTx+nAci97Izdr+7kndZkbji7rLskFJyNnueN0b2ogiFhBFHCDGj12qKRtyIYkubI6njXMz38NHWJ0mY8QZH7eKysLgbqi7LCiklh1PH+dXwGwQ0PxE9NGNhn4gqVBJmnLJT5of9P+FqcagB0bq4LB6uuLssK06mz7A3eYAmoz4bo0EtgKkYvHj1FyStsTpE6OKyNHDF3WXZkLTGeHN0H3E9iirUup3Xp3rRUHl56DWqTrVu53VxWUxccXdZFlSdKq8Mv4WpGGhK/beKgnqA8XKaw6njdT+3i8ti4Iq7y7LgQq6HEStJSA827BoxI8Kh8aNkK7mGXcPFZaFwxd1lySOl5EjqOAHV19DrXE/1nMteaOh1XFwWAlfcXZY8I1aSsXIKr+pp+LVCWoBj6VPY0m74tZYyUjrIai+ychZZOYWsXkTK4mKH5TILpk1eCiG+AjwPDEspt03yfQH8v8DHgQLwu1LK9+odqMvdy2BpCIGYU8njbDEUg0w1T6qcIW5GG369pYZ0csjKcbBeB5kCOeF3LlSksRthPIhQWxcvSJcZMZOV+1eBZ6f4/nPAhmv/fQn46/mH5eLyAQPFIUzFWLgLSsl4JbVw11siyOpFZPb/htI/g5SgtIPa9sF/Igrlvcjsf8Mp/QIpncUO2WUKphV3KeUbwFQFwC8AX5M19gERIURbvQJ0cRm2RvCo5oJdTxEKw6XRBbveUsCpvI/MfRkwaqKu+G8/SOigtILSAqVfIIs/Rkq54LG6zIx65Nw7gN4J/+679jUXl7pQcqy61rVPhyZUSnZpwa632EhnDApfAxEGJTD9C4QGSgeU30KW3218gC5zYkG9ZYQQX6KWuqGrq2shL+2ynFmExaFcjIsuEtLaD1QnX63fCaGC0gTWr5DGLsQC3nyXCpadoeLksWUFVejoih9TDS12WDeoh7j3A6sm/Lvz2tduQ0r5ZeDLALt27bp7Pj0u88JQdGzpTOv4WC9s6Sxsjn8RkdKC8jsg5mCcJrxg90P1Augb6x/cEsSRNulyDwP5dxkvX0AIhdrqQyClQ8xcT5vvIcLGapRFvuHVQ9x/BPypEOKbwG4gLaW8WofzurgAkDDjjFhJ9AZ0pk6GLW2aPYkFudaiU30fZBnmejMTHmR5L+IuEPd0+Qrn0j/AsrNoiolfa74m7jWkdMiUexmz3sdUI2wK/xohY9UUZ2ws0y6FhBDfAPYCm4QQfUKIPxBC/LEQ4o+vHfIT4CJwHvhb4F81LFqXu5J2bytFx1q4CwpBVL87BnlIewyYR4mp8IG98h01R0tnOD72NaSUBPRWPGr0JmEHEELBo0UJ6G1IaXN87GskS2cXKeIZrNyllF+Y5vsS+JO6RbTCcRyH9EiG8aE0lXIVRREEIn5ibRFM78JVhCwnWjzNtdK8BlMp2hQzJSq2jR41kKZckNr6xcVifnUVKrCyN5/T5cucSX0XjxpBU2bWSGeoARShcSb1Xe6N/fairOCX7bCObKHE+YEklapNZyJMW2xuvt4LgZSSkb4kx14/xYm3TlMtX+9+vCZYQiAdSdvaFh589gHW3Lcaw1z+04GkrFKqXKTqjGPLPAIPquLFo69DU2a+8dTqaSaoBSjZVt1LIotJi8GjKYZPpLHSZSxZJmZE+YrxIwyPztqtHdz38Eba1zShKCuwoVv4gfl049qAt07BLD0cWeVM6vsYanDGwn4dTfHUXp/+Hrua/hRFLKzcLktxP9Fzle+/eZyq41zTR8H29e18as9W1CX2ASxki7z2rbc5+dYZFE0l2hJG02//tUspSY9m+Oe/+hnBaICP/+HTrN6yePm6+VB10uSsI6RLr2HbGWqDTQUIB2q9pgSMHQQ9ezC1rmlvyopQeCC6jTdH9tVN3Kslm0uvDDFwaAwhBJ6wjq/FRDiSteFVeFUPlXKV88d7OX3wEu1rEnzs8w8Ta1lZ6RqhtsyvLkhmQH+oXuEsOVLli1ScPAF9bh25hhogVxkkXb5M1FxX5+imZtmJezpf4ntvHicS8GJeE0lHSg6930d3a4zt65dOif3Vi0N8/y9eopgvkehKoCh3FjEhBMFogGA0QD5d4Jv/9Yc89PHtPP7Zh1HV5VNmli0dYjT/bZA2qhLD1DpvO0ZKm1zlGFnrID5jG82BL6BMsyraGFzH4fHj5KsF/Nr8DMTyIyVOfP0yVrZCoNmDuPZ3KdhFEmb8hoeNbmjEmsNIKUleTfG1//NFnvviI2za3j2v6y8p1DWgxMDJz64UEmqpMllBmCtX3Pvz+9CV+T2ZaIqH/vy+BRf3pbXMnQFn+4axHXlD2AEUIQj7vRw4e2URI7uZgQuDfOO//gChCJpXNU0p7LfiD/toWZ1g/08O88uvvY7jLI8271TxNUZy/4imRDG0DtQ7fCiEUDGUZgy1g2L5FFez/x+2U5jy3IZi8HTL4+TtAs482t4LSYujf9+DXXYItHhvCHtFVtAUjU5v+yTxCiKJEMFYgB999Q3OHrk86+vasspAsZcLubNcyJ1loNiLLRd/MIgQCpgfrvnIzBaZAnV1rWt1BVKsjpEuX8GYRQpxMkwlTKp8iZK9sJYWy27lbpWrTPYUr2sKBWvxPywAuVSe7/23F/H6PfjDc1tlKqpCa3eCw6+eoKkzzs6P3F/nKOtLzjpCMv/PGGobipjZfoEQAkNrx6r2M5z7R1qCvzfla9u9reyI3seh8aMkjPis696dqsOZ7/chbQdv7IP0ji1tyk6FTcH1Uw4CMT06sUSIn/7j2zR3RIkmpv/
QV5wyZzInOJ05RtmxbjRHCQSGYrIldD+bQlvRF7GuXhgPIMt7wRmtNSbNBFkEWUR4n1+ye13zpezkEGL+hnVCCAQKZTuHR43UKbrpWXYr967mKFLK2zwt0rki96xa/NpkKSW/+qc3qJQqcxb26yiKQqIjzmvfeofk1fE6RVh/HFliNP8dDLV5xsI+EUNppVg+TcE6Oe2xD8V2sC20mZFycta2vP0HxsheLd4k7FVZpeRYrAt0z2gQiOk1UFTBy9/aP62vimWX+OXgixwe34+pmMSMJuJGgriRIGY0YSom743v45dDL2Etot2BEB6E/3drm6vO0PSVSU4OnDHwfRGhrV6QGBcDR1aYV5noTUhsWa7TuWbGshP3VYkIW7paGEhmKJTKWJUqQ2NZfB6T3fcs/hut//wgZw9cIN4Rq8v5dFNHVRXe+t6+upyvEeStUzhOCUXMzW9dCIGqhEmXXp1WMBWh8FjiYR6MbidZHidXzc/oGnbFofetEbyx2gpZIinaRarSZmNgHTFj5va+0USI3vODDPUm73iMIx1eH/kFY+VRmszmSVfmumIQNxKMWSO8MfKLeaWb5otQoojAvwJ1LTgD4AyCrHxwgHTASdY6UgWIwJdQjNscwFcU9a1uEahiYZ/Oll1aRlEEv/H4faw518+Bs1coVao8uGkVe7Z2EwksfknW4V8dxzD1uj6qRlrCnDt0kUwySyjeuDFzc0FKSbr0Kuo885KqCGFV+yjb/ZNuwk5EEQoPxXew2t/JL4feZNhKElB9eFXPHX/vqUt5qkUbM6JTcixsaRM1wnT5VmEos3vaEEKgGSrH956ntWvyNMZgqZ+h0gAxfeo0hxCCiB5jsDTAUGmANu/UP3sjEUoQEfgDpD2ELB+A8v6awF+/3+qbEMajoK29K7xkdMVfG1oi59fvIKVE4qArjZ0kdivLTtwBdFVl9z1d7L5naZmPWUWLswfO09Ren1X7da7XV58/eokdT91X13PPl4o9SLk6gKHOr0qpltvUyFnvTSvu12nxNPO5VZ/iYu4yh1PHGS3Xyhp1oWEqJqpQkNQ2M/svDFNWyghbEtHDtHgSBLXAnD+0gbCfS2cG7vj9M5njGMKc0fmFEBjC5HTm2KKK+4141BaE93mk5+NAGaQNwkQscJ32YuNV4wT0Viw7i6nOfVFVdrIEjU486sIOf7m7/loNZuxqbTdcUeuf7fL4THrPDCw5cbedLEKodXlSUYSHij0yq9fois6m0Ho2BtcxaiUZtcYYKA0yYo1ScaogBF7Vh3c4yNqmCPFwBLMOtfKGqTE8mqVUsPD4bj5fxSkzUOwlqs/cjCugBRko9lJxyou6uTqRWnu9p35p52WGEIIO3x7Opr83L3GvOAXW+z6x4BvPrrjXkbHBFI7dmDZ5b9DD1QtLz8PDoVy3gQ0CFWeOczqFECQ8TSQ8TdwTvt3E6m+dH6D6FAy1Pp2/16sorFJlEnGvzHosYK2iQlBxKktG3F0gZq5HFR6qThFtDvXulWuvi5hrGxDd1Cy7DdWpyKUL5NJT10s3EqtgNWyRo+oaxdzS8/AQ1G9/QeIgRGP8dRRF1H1qkBC1ksZbUYWK5PaKrqmo5WXlgg4lcZkeVTHYGH6Boj2GPXGDeQbYsoxlp9gYegF1DlVk82XFiHtmLMff/ecf8Hf/+QdkUzOroKg3QhGNG/Ig5awaoRYKVQkgqU+VhyNL6OocfMVnQKw5jFWa3YdzKhzHQSDwBW+vEDIUk5AeoeTM/Cmk5BQJ6xEMxTWPW2rEPRvZEPoUheoIVWdmC6yKU6RQHWVD6JPEPOsbHOHkrBhxdxyJYzs4toN0FmcOiD/sa5i5VLlUIdS0tCplAAy1DV1JYDu5eZ2n1rtQIWDsrFNkN9O5voVivn62waW8RVN7BE2/faUthGBr6H6K9syfIot2gS3hB1ZsQ9Byp9W3nS2R36TiFMlVBinbuduezKSUlO0sucogVafE1sgXaPE9sEgRr6Cce6QpyO/8hxcACMVmMAeyAcTaog3bfCpki6y9f/Hr+G9FCIWw9wlGc99Bncn8zTvgyByG1oqpNaYCavWmVvgR8y5ru042XWTnk1vu+P1V/jUcTr1LoZrHp03t2VKo5vGoXlb51sw7LpfGEfds4iHjX5O03qc//w756hACBYlEIJA4+PUW1gQ/SszcgLrIeycrRtwBIjNoB28k0ZYwhtegXCpjeOr7h61YFVZvmXuZ3KiV4Wymj/FyDlva+DUPa/wtrPa3oCnzy/P6jftIih/gSAtlDjlzKSVVJ0XC9/GGrVwT7VHaVsdJJXOEorM0yLqFasVGUQSbpzAQMxSTJ5uf4+XBH5GtZAhowdt+NikluWoWicNHWz6F4W6kLnlUxaTZu42EZyuF6ghlJ4cjKyhCx1AC+LTEknn6WlHivtioqsquj97P2z98l5au+lkhVKwKps+ge9vsV7WXcoO8M3qay/khFCHQFR0BVKXDgeQ5fJrJ7vhmtkfXYc6xkkRVvMT8nyKZ+w6G2j7reuiKM4ypd+M3753T9WeCEIInfm0XX/+Ln+EPelC1ud/QkoMpHnnufvyhqasn4maCZ9s+zd7ka4xawyhCxbx287NkCUc6NJnNPBx/gohR394Il8YihMCvN+OnebFDuSOuuNeZbY9sZu+PD1KxKuh1GrgxNphizwsPzmqAh5SSt0dO8drIMfyqhxZPdNIVRcku86uhI5zKXOGzqx4jqM+tyzdkPoxtpxgvvjxj8zApJRVnCE2J0hL83Tmt+mdD+5oEDz2zlXdfPkFLV3xOK6yx4TTNHTF2PbV1RsdHjBjPtn6a8fIoF/LnyJRrHkEhI8o6/0aiRtOSWem5rCxcca8zoXiQD3/2YV75p7do6Z7/I1p2PEc4EeLBZ2e3MbM3eZrXho/R7IlMmXbxqAZt3hijVppvXnmdL3Y/hVedfXpACEHU9xyqEiCZ/xFCKGhKHGUSP41aGmYcR+bx6OtoCf42qrIwm8WPPHc/2bECpw9dJNERnfEKXkpJcjBNMOLn1/7wyUk3Uu+EEIKYmSBmLr6xncvdgyvuDWD70/dy4UgPfeeukuice2lfqWBRyBb5F//647PK4Q8Ux3ht6Pi0wj6RJjPMUGmc14eP8WzbrjnFK4Qg7H0cn7GZTOkA2dLbVKSFQEUIFYmDvDbSzatvJOx5HK++fkHb2lVN5dl/8TChpgDvvnwcj88gFJvahqCYt0iNZlm3tYOPfv5hAvN0+3RxWQhEvRs7ZsquXbvkwYMHF+XaC0ExX+K7f/5jBi8Nk1gVn3WJZD5dIJfK82t/9hwbdsyuu+3F/nc5lblMkzm7kXBVx2a8nOXPNr6AT5t/isSRJQrWGSrOCLbMo2CgKiF8xkZ0dfFzlVcvj/Lmi4fpfX8IRM2v3fToIATVcpVSwcKREI762f7RrYQ3R8lULGwp0RWVJo+fhNePqbprJJeFQwhxSEo57QrMFfcGUipYvPL1Nzn2xmnC8eCM/N3tqk1yYIxAxM/zf/xROjfePhloKvLVEn957kfEzRDqLIdZAAyVxvlY6052xB
an8WIxSA6m6bswRN/FYcaHM0gp8Qe9tK9NYMVVTuspLmTHrpW71drUBB9YBuxMdLCndTWdgYUbxOBy9+KK+xJBSsnFY5d59RtvMTaYRjdUfCEfps+80XFasSoUcyWKuSKKprL9qXvZ88KDt3mWzITTmV5+0PsOrd65OdDlqkWCmpffW/vROb1+pTBSzPOdC0c5n04S0AyipnfS1E3VcUhaeSq2zZ62bp7r2oxXW/hWc5fGkS+VGUhmsCpVBOAxNDoTkZtGfS4kMxV393mywQghWHd/N2vu7WLg/CBn3n2f3rMDjPSNcn0J6PV76NjQytr7V7Nx5zq88/ClL1YtmIcFgqFo5Kv16+RcjpxLjfB3Zw6gCMEqf3jKfLymKLR4g9jSYd/gFd5PjfKHW3YT87h5+eWMlJKBZIaDZ3s5cmHgxhMbsvbUpmkqD23uYvv6dhLhxWmanA5X3BcIRVHo3Nh+I81iV22qlSpCUdANbUmVwzXMH2cZcCkzxn8/tZ+w4cWvz3wTWxUK7f4QI6Ucf3NyL3967yOEjLlNpnJZXCq2zUv7TvHe+X50VaUp7Ee9xca7UrXZe6qHt05c4qkH1vP4vWuXnPfTivGWWW6omorpNes+tckzhzLGiVQcG38d/M6XI4VqmX84e4iAbs5K2CeS8ARIl0t8/+KJurtQujSeim3z7deO8t77/bTGQiQigduEHUDXVFqiQZojAX753jl+fvDskvt7u+K+wljlq9XW23Ocx5mtFLkvcnd6nPzsyllyVWveK+5Wb5CjyQFOjA3WKTKXheIXB89ytneYtngIZQaLLk1VaI+HePvkJQ6c7V2ACGfOjMRdCPGsEOKsEOK8EOLfT/L9LiHEq0KIw0KIY0KIj9c/1PoipSSXLTE6kmV8LEelYi92SHUhqHvZElpNqjx7l0ZbOihCcE94aY0vXAiyZYt9Q1do8cy/mUoIQdTw8nLfuSW3mnO5M6lckXfP9NIau90HaCoURSERCfDLw+9TqS4dHZk25y5qk3D/CvgI0AccEEL8SEp5asJh/xvwbSnlXwshtgA/AbobEO+8kFIyOJDiyHs9nD83SKlYuZEnkxISzSEe2NXNxs1teL3L18RpZ2w9J9I9VB17VqZgSSvDfZFu/Nrdlys+lryKlBK1TpbNQd1kIJ+hP5+hMzC7fgOXxeHYxQGEEHOy7TZ1jbFMnvf7R9iyurUB0c2emWyoPgScl1JeBBBCfBN4AZgo7hK4bskYBu48OXiRyGSK/PKnxzh/bhBd1wiFvYQn1J07jqSQt/jFS0d57Zen+Mhz93LP1o4ltdE5Uzq8cR5LbOP14eO0eqMzqncfs7JEjABPNC+tGa0LxdnUMD6tfjf06++b/nzaFfdlQMW2eefUZaLBuVWqObaDoai8fuQC93S1LAndmIm4dwATk0l9wO5bjvmPwC+EEH8G+IFnJjuREOJLwJcAuroW7tH/av843/nGPqpVm5bWyUvbFEXgD5j4AyalUoUff+8QvT2jPPPsfaja8tqaEELwaGILVWnzzugpgpqXgDZ5nXbZqTBmZUl4wnyu63F8d+GqHaAnO06gDl25EzFVjUuZMXa33H1pruVGNm9hVSpEZlGGLB1JOpnl6qVhxofTgCQvgbd7eOhj97H5wfXTOoc2knqVQn4B+KqU8s+FEA8D/yCE2Cblzbt6UsovA1+GWhNTna49JSPDGb75j+9gmhqRyMxqjz0enZa2MEfeu4xQBB957r4lcSeeDYpQeLL5Pjq8cd4ZPc3VYhJFKJhKrTqnKm3KdgWPavBYYhu74hvnZBi2EnCkpFCtEDXq+0E0FJV0eenNvXW5HataZTaTdkp5izMHLlDIFtF0FX+wtnhy7CpSFbz67X28/r13eeYLe7j30c2Loh8zEfd+YNWEf3de+9pE/gB4FkBKuVcI4QGagOF6BDlXqlWbl374HpqqEAjMbkWqKIKW1jCHD/awdl0z6ze1NSjKxiGEYFOok43BDoatFKfSV64N63DwqibrAq2sD7ajK267QyO4blPgsvTRFGXGvX+lvMWJt8/g2HJSEzmf3yQY8FC2Kvzsq69TKVfZ+XTjZhXciZl8qg8AG4QQa6iJ+ueB37rlmCvA08BXhRD3AB5gpJ6BzoUTR3oZGkzT1j43zw9FEYQjPn720lH+aG0z+ixsXpcSQghaPFFaPHOzJFjpKEIQNjyUHbuuJmCWXSXudqouC3weAweJI+WUJZBSSs4evIBtO3j9Ny8Yq1KiCXGjBNEwdRKdMV751ju0djfTsa6lgT/B7UybTJZSVoE/BX4OnKZWFXNSCPGfhBCfunbYvwX+UAhxFPgG8LtykWvAHEeyf+/7M07F3Amfz6BQKNNzcVEfQlwaTHcwSq5Srus5K47NmpA7YWk54PcYbOpMkMoVpzwuO5YjnyneJuwARcemy/TclILRDA3TY3Dol8frHvN0zGiZIqX8CbXyxolf+98n/P8p4JH6hjY/RoczZDMlmlvmP1fV69E5cayPDQuUmnEciW07aJqy7HL9y5V7oi0cHb1at/NJWTNxWOU6RS4bPnTPas70Tp1wuHppZNICi+t/7w7zdtEPJ4KcO3yJzFiOUGzhfGhWbLJ1dDRbtwYSn9+gvzeJlLJhYmtZFc6/P8S7+y8wMpwBaumUNWub2fXgGlZ1xSdtg3apD9tiLQgFegujpKp5Ko4NSDShEjMCNJsRjFnsTYxbRTaE4yS8S9NUyuV2VrdEiYd8pPNFwv5JNtelZGxwHN8k5ZJ5xyah6/jU21O3yrV8/tVLw66414PUWL5uRj66rjGWzFOp2BhGfX9lUkqOH+vlVy+foFKxCQRMmltCtZ13R9Lfl+TC+SHCYS+f+vRO2tvdvHm9GS1l2TdykZQzzpXsOGHDi3rtJm7JCqlKnov5IZrNMB3eOEFt6qoaWzrkqhbPdG5ciPBd6oSqKPzmhx/gb3+6j1zRIuC9uTTWtmvFf7cu8AqOjaEobPFPIdxSUrEqdY95KlbsUtBpRMq/Aafcv/cCL/34CIGgh5bWMP7ABzk7RRFEon5a28JUbYev/8M7XLk8Wv8g7mLOpgf5yzOvcCDZw5ZICwlPEEeCrmjoioap6gQ0Dz7VZLSc5XDqIoOl1B2fCqWUXC1keLRtDevCcx+x6LI4tMaC/M5HHsSqVBlJ5XCcD6q5FUVByg9cUx0pydhVdCHYFQzjmaobXAi0BfZ/X7HiHgx6cZz6qLFdddB0dVZDkWfC2TNXee3VUzS3BKd9IggGPfgDBt/7zruMj+frGsfdytn0IP9wYS9BzUOLJ4Sp6twfb0cAxerNqyxFCHyqgVc1OJvrY7A0ftv5pJRcLWbpDsb4eNfmusVZsssUqiXXp2aB6GqO8EefeJh7upoZTuW5OpYhV7Qo2zZG0CRXKpOxqxQcm1Wmh92hCP5J0jHXkVKClEQS89//mw0rNi0TT8zO/GcqCsUybe2Ruvo1Syl58/UzhMJeNO2DN0alVCE5lMYqWEhHYnh0IokQvqAXn8+kkCtz5L3LPPn0lrrFcjcyXMrw9Uv7iRi+m2wH/
LrBg82rODTST6ZsEdCNm0rjVKHgU03O5a/i1Uwiuh+olT0OFXNsiiT4l5t24KnDNKbB4hhvjR7lUq620RsxgjzctI2toW53o73BNIX9fObx+/nozhLHL13l5OUhClaZtu5meo/3sjEcotUw0WfgQ1PIFGnqiNGyumkBIv+AFSvuzS0hdF2lUqmiz/NxqJC32PNYffOn/f3jjI3lblTzFLMl+t4fZHRgDClBUQUgao+Fp/oIRgOs2thGOOrn8Hs97Hl0A6bpjnObK+8MX0DApH4yAd1kT+tq3k+P0pdLIwT4VP2GqZgqFHSh0JMfZoO/g1S5iKYo/MbabXyoZXVdzMf6CyN888qv0IRKwoyiCEGhWuLFgbdJl7M8krg7PYAWmpDfwyPb1vDItpoNdrlU5m/+13/CayszEnYpJelkjsd+/aEFvyGv2LSMYWhsf3ANY2OFeZ2nUrFRFMGme2Y3qHrEPkUKAAAgAElEQVQ6Th7vRdNUhBCkR7IcffM0ycFxvEEvgbAPX8CLL+AhEPLhD/so5S1O7nufocsjVCtVLve4ufe5kq9avDd2mZjhv+MxuqKyJdrCntbVdAWilB2bTNkiW7bIVizKtsPVUoqSLPPJ7i38hx1P8UjbmroIu5SSl4cO4FEMokbwxpODT/PQYkZ5J3mS9BwsnV3mj+ExeO73n2B8OINVnLovQkrJ8JUkG7evYfOudQsT4ARW7ModYOeDazh6qIdisTwnC18pJaMjWZ54Zgs+f31NpcbHC5gejVyqwOl3z6ObOvod8u4Cgek10E2Ny6f6iK5KkM/f3XNO58Px8T4cR87IDjmgm2yKJNgYacKyqxSrVSQSVQiy1RKPtHTyePvausaXLGcYsdI0G7fXyKtCRUrJhdwAO2JuNc5isOGBNXzyS0/zk6+8iqKqRBLBm/bjpJTkUgWyqTwbt6/h47//JKq28N3tK1rc/QEPz33yAb77zf1ozcqs0zPJ0Rwdq2Lseqj+d13pSISEC8evoGjKHYV9Ioqi4At6GewZIZ+ZupPO5c6cTA0Q0Gd3sxYIPKqOR/0gFeZRNY6P9/NsR319Q8pOBQVxx8d4VSgUbNeQbDG556H1JDpjHHntFMffOottOzUfIVH7bLeuSfD0Fx5h/f2rF0XYYYWLO8D6TW08+/wD/Pylo4TD3hmtwB3HYWQkS6I5xKc/91BDLH/9fpPLF4fIpQr4wzN3I1TV2spt5Iqblpkr+YqFLub/gdMUlYxVf5EN6wEkEls6k3rx29IhYbqdr4tNU3uMZ37rUR751C4Ge0awSmU0TSXUFCTREVv0Te8VL+4AD+zsJhL185MfHWZwME0k7MUzSZrGcRxS4wXK5SoP7Ozm8ae24PE0ZtPynq0dvPKTIyiKQMzCO1DKWgVN39EepHx00d9AyxEHWRe7RnH9XHXGr3nYGlrDicxFmo3oTX/jXLWIX/OwJlDfPSCXueMNeFizbdX0By4wd4W4A3SvTfB7f/QEJ4/1cmDvBYaHai3+12uHhSJASjZubmfn7rV0dDbW8GnN2gSVYhlllrXzpbJNS8xPOV/CKpbx+Oq7F3A34FMNUuUCzHPxXpUOXrUxN/8nWrYzXs7SVxzGVIxaKsax8CoGn1n15KysEFzuTu6qd4jXa7Br9zp2PLiWdCpPcjRH2aoiFEE47CXWFGzYSv1WNE2lNeKhZzCLac7Ms8Z2JLYjaW8KUBrLYq+Qod7zRUpJpmAhpSToM6etWFkXaua1wTME9PlNnUqXi2yJNMZMzquafK7rKXryg5zO9FBxqnT7W9kUWn1Xzrh1mT13lbhfR1EE0ViAaJ1MfLKVDMnyKKPWELlqDhCE9BBxI0HCbManTV5yt2ZVjHSmRKZUIeDRpxR423EoWFXWtYUJenVKUqIv0I1oqSKl5OTlIV47foHhVA4hIOj18OF717Jzfecdm862x7p49eqZab27p7t22bHZ3VTfSpmJaIrK+mAH64MdDbuGy8rlrhT3ejFiDXE09R59xSsgQRUa2rXH5SuFHiQ1X4pu31rui+wgatyc6tn4QDdXLw7h83sYGi+gKOAxtJsEx7YdStdW6Bs6IrREfWTH87SvbcG4y5uY9p6+zEsHThPxe2mN1jqSi1aFH+49yeB4lucfumfSG2bM9LMx3MLlXJK4ObcbfL5q0ewNssrv+rXPhLJVwSpY+ILeRaseudtwxX0OVJ0qR1IHOZE+iqGYxPSmO666HenQW7xMT+EiOyIPsTV8H8q1Coh7HlrPmz88wNrWIG1xH1fHCgynCh8YlIla2VtXc5BExIvnWrlkMVfiwd9+fCF+1CVLOl/i5++doyUaRJ/g6+E1ddr1EO+evcIDa9tZlZi8quSxlo2cSb9BxbHRZ1DvPhFbOoyXC3y+Y+G7DpcbdtVm70+OcOiVk9hVG6/fw2O/tpOtH9rg/u4ajCvus6TiVHht+GX6ileIGnHUaUrqFKEQ1qNUnSoHx/eRqozzSNOHUYRCMOrnngfXc+bgBRIdMTZ0GHS3hLAqNo6UqIrAY2ioE9ILuVSeQMRH95a7+1H95JVBpJQ3Cft1FEWgayqHL/TfUdzXBJr4ROe9vNh7jFZveMYCb0uHq8UUj7Vs4L5o57x+hruBt198j30/O0qiI4qma5QKFj/56huYPpMN969e7PBWNCvWfqARSCl5J/k6/cU+4kZiWmGfiKZoxI0E53NnOTT+7o2vP/HZ3UQSQZKDKQB0TSHg1Qn5DPwe/SZhz2cKlAoWL/zxRxbcPnSpMZYtTCrs1/EYOqOZqd0z9yTW81zHvQyWMmQqxWldF/NVi4FCij3N6/lYxzZ35TkNVrHMe6+eItERu/F+9fhMgjE/+392dJGjW/m44j4LevIXuZg7T8yIz+mDLYQgZjRxIn2EoVLN6c8X9PLZf/MJ4q0RBntGyI7nbxOZUsFiqHeUSrnK5/7n52lb01yXn2c5E/F7r01LmhyrXCUy2TSdCQgheLx1I7+77mGCuoeBUprhUoaKU62NTZOSqmMzauUYKKZQhcLn1jzIJzrum7S5yOVmirlSzebhlnJfr99k7NpixqVx3N3Lv1lgyyr7x94mqIXntWJThIJP9bM3+RYvtH8GIQTBqJ/P/7tPcvH4FQ784hiDPSM3Kj0k4A95efKzH2LzrnX4w/Mb+L1S2NLVwi8On8O2ndvGD0opsapVdqyfWepqU7iNjaFWBoop3h3t4WSqn5Jd83M3FY0NwWY+1LyO1f7Yjf0Sl+nxh33ohkbZqty0+Z9LF2jrTixiZHcHrrjPkP5iHyW7NOfqiol4VR9j5VFGrGGaPS0A6IbGpp1r2bhjDWNDafLpAtKRmD6DRKc7P/VWYkEfj29by6tHzxMP+fEaNfEoV6qMpPPcv7adrsTMRxIKIejwRfl0V5RPd22/NvBYumI+D3RD45Hnt/PLb+4l3BTE4zfJpfKU8mX2PL9jscNb8bjiPkMuZM9hKLN3lpwMIQSKULlSuHhD3Cd+L94aId7qeodMx9P3ryfkM3n92EUGx7OAxNA0ntm+gUe3rpnXcBUhZmcL4TI525/Y
gukz2ffTI4z2j9OxtplH/nAnHetapn+xy7xwxX0GSCkZsgbxqjM3+JoOr+pl8Fre3WVuCCF4aGMXO9Z1kszkcaQkHvRh3OWbzUsJIQRbd69n6+71ix3KXYf7KZgBFVnBckr4tfp0tAIYikmqcvscTpfZo6kKLdHgYofh4rKkcBOKM8CRDnWxEZyAQFw7r4uLi0v9ccV9BmhCA2Rdp8/b0kavUw7fxcXF5VZccZ8BmqIR0sOUnalnJs4GyymRMN1yMBcXl8YwI3EXQjwrhDgrhDgvhPj3dzjmc0KIU0KIk0KIr9c3zMWn3bOKoj11x+NMkY5DcnyMFsMduODi4tIYphV3IYQK/BXwHLAF+IIQYsstx2wA/gPwiJRyK/BvGhDrorIusAFb2nVJzaRG0lw+1YcxPLkVsIuLi8t8mcnK/SHgvJTyopSyDHwTeOGWY/4Q+Csp5TiAlHK4vmEuPnGjiRZPG7lqZv4nC9s8vPVh1q9rnBe4i4vL3c1MSiE7gN4J/+4Ddt9yzEYAIcTb1IaX/Ucp5c9uPZEQ4kvAlwC6urrmEu+iIYTg4fhj/PPAd6k4FXRlbl7qll1CU3U+tuVZ19fapS7YtsNg3xgjg2n6Lo0yNpKhWq3ZMkTiATq7m0i0hmnvit/m8+KycqlXnbsGbACeADqBN4QQ90opb3IHklJ+GfgywK5du+o/WbjBRIwoe+KP8dboa0T0KNosBb7sWOSqWZ5pee6O05lWImXb5sz4MG9cvUh/Po0jJc3eAI+1reXeeCte7e4eOjJX8rkSZ472cuDNs+SzJaQEj9fA8GgIIahWHfoujXD+VD8Aptdg5571bN3RTSjiehStdGYi7v3AxNHende+NpE+YL+UsgJcEkKcoyb2B+oS5RJiQ3AztrTZl3wLj+qdUWOTlJJcNUtFVnii+SN0+pbXU8t8GCxk+crpdxmzigQ0gyaPH4EgXy3z7QtHefHyKX5n0y7WheOLHeqyQUrJ2eO9vPzDw1ilCuGYn+b2yX10vD6DULS2kChbFfa+cpr9r53lyU/cx7Zda1zPohWMmG6DUAihAeeAp6mJ+gHgt6SUJycc8yzwBSnl7wghmoDDwANSyuSdzrtr1y558ODBOvwIi8OINcSbI6+SqabxKj68qu82t0gpJXk7h2WXaDITPNL0xG2j9lYyo6U8f3n8LaQDMc/kK8VcxSJbsfiTbXvoCs7c6GuuVBybC+kxLmfHuZAeY7SYx5YOpqrR6Q+xJhRjTThGhz+0JP3arVKFn3//EGeP9RJpCuDxzr5XomxVGRvO0LW+med/czf+oDtwezkhhDgkpdw17XEzqf4QQnwc+Atq+fSvSCn/ixDiPwEHpZQ/ErVPwZ8DzwI28F+klN+c6pzLXdyhNm6vt3CZk5ljJK0RhFD4YEYeSCStnna2hu6lzds5q+EeK4G/P3OQM6lhWrxTWwOkrCIB3eTfPvB4wwS1UCnzzuAV3ui/RL5aRhGCgG5gqtqNbuFCtUKxWrP6bfeHeGbVerbFW+c8RLvelIplfvC1t7l6ZYymtvlZT0spGRvOEo76+MzvP04wXD/fJJfGUldxbwQrQdzzxTKZfAnHcajKCo5RRDMkQgg8qpewHl7yXajDIxm+86NDKELwmRd2kojXx6Nl3Crwfxx6hTZfaFpxlFLSV8jwZ/fuoTtY/yeb91OjfOPcEdJliyaPH482dTZSSkmmbJEul9gSa+Yz6+4l6llc8bOrNt//2tv0XhqlqaV+TxVjIxkisQCf/9ITmB5372M5MFNxd43DZoHjSPqGUxw508fF/iSZfAlFuZ6zlEhZG9Dc3Rbjgc0dRDuW/q9336FLFAoWjoSDh3t47pl763Le8+nkNT/06UVICIEuBCeSg3UVdyklr/Rd4KWeM0RML52B8IxeJ4QgbHoIGSYX0mP8+eE3+B+2PkR3qPFpoztxeO8FLr8/THNHpK5PN7FEiOGBFO/88hRPPn9/3c7rsvgsffVZAkgpOd87yi/2nSGZyqNrKkGfSWs8eNsHrVyxudA/yslLgwR9Jk8/tJF717fPy1u8kbS1hDl2sg8hoLV5ZuI3E3IVa1Z+6LqikSlbdbs+wCt9F3ix5zTt/hDaDAdgT0QIQYsvQKZs8Tcn9vMn936IVcGF99lPDmd4/efHibXc/n6rB/GWEAffPseGbR10djfV/fwui4Mr7tNQKJV5ed9ZDp/tI+z30NYUmvJ4Q1eJ6bXNw6JV4QevHOPkhat84tGthINLL6+564HVJOIBEILVnfVbNXtUbcpuXls6KIgbYmVLB49av7fjufERXuw5M2dhn0jIMAHJV04f5H/Z/jh+fWFTbXt/dQpNU9Ab5FOvqgq+gIfXf3KM3/ofn1ySG8kus8etg5qCdK7I3/1oP8ffH6AtHiLgM2f1eq+p054I0XN1jL/9wV6GktkGRTp3hBB0dzXRvWpuQ7/vxKpAFIS4TeCzRYvD5/t549hF3jrZw+XhcRwpKTs2m6L1MVIrVMp84/2jRE3vvIX9OiHDQ65c5sWeM3V1B52ObLrI2RP9RGL1myUwGcGwl6t9Y4xcTTf0Oi4LhyvudyBXsPiHlw6QyZdoiQfnnFYRQpCIBADJ1158l9FUrr6BLlE6/CE6/WHS5dKNrxXLFQ6f7ydXsgh4TQxN4cLAKOcGRvBrBhsj9RH3twcvkylbBI3Z3Yyno9Uf5N2hK/Tn62BBMUPOHq81hysNrkcXQqBpKife62nodVwWDlfcJ0FKyY9fP0EqW6IpXJ9O0nDAi5SS7/zyCJWqXZdzLmWEEDy/+h6yFYuSXQVgcCyLIyVeQ0cAqqLg9eicHxvlIx0b0euwyi7bNq/3XyLhrX8HsCIEmlDZN3il7ue+E5fODeL11/cmdSf8IQ+Xzg4uyLVcGo8r7pNw7P0Bzl4ZpjlaX4GIhnwMj+V4++jFup53qbI+0sRvb9rJWCnPUCFLplRCu1Zd5EhJzimTp0K3E+WeQH1W7efTSUrVCmYd8/cTafL6OTjcR+laPXwjkVJytXccr39hcvymR2c8mccqNf5nc2k87obqLVjlKj975wzx8O0dp/WgORrgzfcu8sDGTiJLcIO13tzf1E6zN8Deocu8mDpNyi7ilwZSQpsWpE2G8BgaoUB9uiR7MuMoonFrFk1RsB3JUCHH6gaXRuazJaqVKtoCGcwJIVAUQWosR8sd7AwazeholjOnBxhL5tB1le41Cdata8EwXamaLe5v7BbOXR7GKleIhRojvJqqIAQcPdfPh3feHRPh2/whfn3tvTzRso6//tk7jOeKNPn8yDIULIuPPrIVXa2PgF1MJxtezSKRDBayDRf3Stmm3rN7p0MIqFYWPm2Yz1v89KUjXDw/jKoq6IaKIyXHj/ViGBpPP7ONbfd1upU8s8AV91t459ilWVfFzJZo0Mv+E5d55P6acdOIleZE6jKj5TSWXcGjGqzyJdga7iKorxz3vpjfx//08cd490wvZ3qHCMe8fOieLta116+2eqSUx6c1VtxVRWGkWJ+pXFMhBAut7bXrLvBFi4Uy3/r6XsbH8zS33t5
9W7aqvPTiYaq2zfYd3Qsa22wp2RWSVpaKU0UTKmHDR1BfnCd0V9wnUCiVGRrL0hqrTwv+nTB0jbFMkSODPZyq9NBXGEURAo9qoAiBU3Y4nxvg1eGjbAl1sadpCwlP/RqMFpOA1+Sp7et5antjnlpsKWl0v5iCwF6AckjDoyOd2mD2hVqxSikxPAsrC/v2nmd0NEtL6+TvccPUaEoE+NXLJ1i7rplweOkteIZLad4b6+FQ8iKOdG58XQL3hDt4ML6OLn+8oSnDW3HFfQKj4/mbGmsahUQy4h3hm319dEZjtHhubykP6X4c6XA228+ZTB+f63qMNYHWhsa1lHCk5HIqxYVkkqxVpuo4BA2DzkiYjU1NGHdI4xiKii1lQ9/YtpSYdUojTYXXZ+ALmFTK9oLknB3HQSAaXlM/EcuqcPi9HmLxqa+p6xpSwqkT/Tz8yIYFim56bOnw2uBJ3ho+gyIU4mbgpt4KRzqcz17lZLqXjcE2fr3rITzqwmyQu+I+gfFsAWcBVmSD6ghDwSHWWi2E9DtX5ChCockMUahafOvKG3yx+yk6fSu7PbxYqXBiaIhXL15iJF+72eqqggCqjkPVqZVSPt69mh3tHcR8Nz/ytvuDXMmlG1YtA7UPbKuvsU93UNvg7Fgd58qFkQUR91KxQlNbaEGnNQ30j2NXHfQZXDMU9nL8WO+SEXcpJT/rP8L+5HnavBHUSVblNcEP1ixMskP806W3+OKaxzDVxpu0uaWQE6hU6zMAeyqKosQlvRev7WHC09uU+DQTn2by/b63qTort0b+ajbL//PWO3zn+AkqVZvOUIj2UJCE30+T309rMEhnOERA13n5/AX+rzff5NjgzXXZ68LxG7a9jUK55jmzEKzb3LZgpYn5dJGN2zoX5FrXKZftGe8raJqCZS2dMs33xi6xf/Q87XcQ9okIIWjxhOgvjPHTgSMLEp8r7hMQovEpmSF1FBAoUpnVtQKal1y1xKX8ymwy6U9n+Ku9+6jYVTpCIQLmnR9dTU2jPRgkbJr8/XuHebev78b31oZiOFI27CZdqlbwqDoJ78KI+/otHWi62vAKFsdxavnh+xd2SphpTu1BNJFKxcbrXRq2xLZ0eGP4NHEzMOM8uhCCZk+Io+OXSZcLDY7QFfeb8Jr6tRKFxmBjM6iN4JEmjpSYxuwetb2qwf7k2QZFt3iMF4v894MHMVSVqHfmlQVeXafF7+fbx05wdmQEgK5ghBZvgFyl3JBYx0pFPtyx5kYzVqMxPToP7F7L+GhjfYlSo3k2bG1f8Nmq7R1RdF2jUqlOe2wmXeK++1cvQFTT05MbIVMp4p1lZZYiFASCY+ON73J2xX0C8Yi/oZUWGSWPIxzUa7/2wCxHpIU0H735EXLVYiPCWzReu3iJYqVC2DP7RiZT04h4TH5w8hTOtaqSp1etJ2UV6756t+wqihDsbO6o63mn48HHN+HxGhTz9bVEvk7ZqiCl5PGP1cfLfzYYhsbOXWtIJqcuLS1bVVRVsGXrwv7u78TR8R4MZW77IBHDx4Hk+TpHdDuuuE8gHvIBAtuZYTJ8llRFFWStWkZK8HlmJ+7X00Yle+nkHedLoVzh3b4+Er65Wz0ETZNksUjP+DgADzS1sT4SZ7SOtehS1rpSP7lmMxFzYeuWfX6Tj/36TtJjeWy7vu9NKSXJ4SxPPn8fkWkqVhrF7ofX0d4WYXgwjePcfkMuFsskkzme/fj9BJbIvNfxcn7Om/amopGpFm8qmWwErrhPQNNUtq5rZTzTuJWxpDbQI+gz8cwyLQPX954WZzRiIzg2OEjVcdDm6XpoqipvX74M1JqMPrv+PiS1oSH1YKiQY0MkzsOti5MWWLu5jQcf38TwQAqnTgIvpWR4IMXW7V3cu2tNXc45F0xT57Of/xD3bOlgdDjL8GCasWSO0dEsg1fTVCsOn/6NB9m6wJu9U+HI+TV7CUTDP8VuKeQt7NqyimPnBhrSOKJJFSFq4r6+s2nW+X0pJQ4SU1kam0r1YF9vL2Fz/h3BcZ+PE0PDFMoVfIZOwuvnD7Y+yJdP7EdK5mz/K6VkuJCjyevnX27agbpAufZbEULw+LPbsG2HQ2+do6k1jD7LxYEjbHLeNHkzQ9ZIky1nCW8MkNpg8+pIkTZPK6v9q/Br9XfUnA6PR+f5F3bw2BObef/cIKmxPLqusWp1nNWrm1C1pbUODeom4+UsMPv3VVU6mIo2bYXNfHHF/RY6myOsao0wOp4jGqrv5lLA8VOtOJi6TmwOVsK5aokWT5SAtnQNx8qVKucvj3DkdD9Fq0xHc4QHtnTSeocJVuPFIkFj/k0diqitowqVMj6jdvNbH47zx9s+xN+dPsjVfIYWX3BGM11v/Cx2laFCjnXhOL+9eUfd/eFni6IoPPX8/URifl7/6TF0QyMc80+7CHGEzWhokNHwAI5iUyk5yBJ0rWqhY3UCB4feQh8Xchd5a1SwLrCGXbEdhPWpp441gnDYx64H1y74dWfL1vAqzqSvEpnDW3fcyrMj3vgnJVfcb0EIwScf38bffPdtKlUbvY6OfKqj4MsFad6oo84hDZGvlvhI6/Yla56UL1h8/cWDDI5k8HkNdE3lyOk+Dh6/zNN7NvHw9ts/tOWqjWLW6ecRUL4lZbE2HOPf7fgw/3zxJEdGB9AVjSaPb8oVuGVXSZbyKCh8et1W9rSuXrQV+60IIdj5yAa6N7bw8+8dYuBKEt3QiMT8kw70KBg5+hLnsbQiMqsiywp+v5cN29sJTDDHM691TTrS4VL+MpdyPexJfIh7gpuW7PttMdkYasNUNcpOdVYbq1JKbGmzI+aK+6KQiAb4yIc28dO3T9HeFEKpwwdbSsngWJYntmzjkvcSVcee1Qi4ol3Go+qsD7bPO5ZG8eJrJxgZy9E2YdC2z2tQtW1efvssbYkw3Z3xm17j0TQcKanLLVSKSW0JQobJFzdt57H2Nbxz9TJHRgeQ1P4muqIihMCWDraUCGrzXz/atZFdiU6inqX5lBRPhPjCHz3B1StjHNl/gTPH+0DWbBs0VUEogpw/xXDiEkpRRa0YROMB2lbFCEV8d5wspgiFmBGh4lR4ffgtxsrj7InvXlBPlOWAqersblrPG0OnafPebh9yJ5JWjq5AgpYF8Ipyxf0O7N62mvFMkf3He2htCs5r5Sal5Goyy+buFn59933sHffxxshxWj3RGX1oLLtCqpzjN7sen3P5VaMZS+V5v2eElqbb2/I1VcXr0dl3tOc2cY/7fYwVCoTn6dVSvVbhFDAm348QQtAditIdivLC2i0MFrIM5rOMlPLYjoOparQHQjR7A7T4AnWZCtVohBC0r47TvjrO05/azthIluRwhvHRHEknyXH/RdaanYT9AXx+c1a2ArqikzDjnEidRBc6u+O7GviTLE8eTWymJzdCf3GMZvN2N8tbSZUL6IrGp1c9uCBPQ0tTKZYAQgg+9vBmTF3ljcMXCPs9c7ICLloVxtIF7t/Uzicf24amqTya2ILlVNifPENED+DTJj+vlJJstUChWuaFzoeX9Kp9eKw2G/ZOb9pQ0M
OlvtHbvv7o6tX809Gjc6pxn0iyUGRnRzseffrNZr9usC4cZ104Pu2xywXTo9O2Kkbbqhhlp8x3en9At2zDr81936jmixLjcOooXb5O2rx3j3HdTDBUjS90P8J3r+zj/cwgYcNLQPPc9hko2WXGrAIRw8cX1zxKxFiYDWtX3KdAUQRPPfT/t3fnwXGe92HHv7/3ffe+cR8EAR4QRYqnSFGHbR3RYUm1pbi2WjkTx07Vuk7q/JOZTtNmxpNxptOmM03aTtxp1CZ1nMvXxK4SSeNLsiVLlixKpCkeIs2bAEgCBBaLvXff9336x4ISDwC7IHaxOJ4PhzOL3XfffR7s4rfP+xy/5xbWr2njuy8f5MLlKeKRQGUlaxXFkk0yncfrMXnq4Z1sWd/1/ptuiMFDnTvp8MV4bewwFwtJvIZF0PQhCC4u2XIBG4feQBuf7NtBX7A+29A1SrWGiOuqGVd1bu5or/RdOs6smR6vcFyXQtnGmc5e6LFMfNNjImXX4a6+vpsu/0ryTvIAWTtLm29hX16uUowVkyjl8uPRV3mq7xNYS/TKsVkClpdPD3yIo6lhfjp2jAv5SaDSyKnMuIOQ5eejPdvZnugnNEtDrhH0O1WDgZ4WvvCpD3HwxAiv/+I0F8anEAS/18Lvsyo52FUloOeLZUDh93m4f/dGdt26ZsYWv4iwI7GebfEBzuXG2D9xkrFiipKy8RseBlt62RlfT7svtiwGtNZ0VvodHcedcQ2j+bcAACAASURBVLB4MpVjx+YZ5ikr2NXVxStnz9IXi2KZ5jWzh4tlm4upNJdSGTKF0vSXiMD06lPTNPB5LNZ3ttAdac4inKWk4BQ5lDpCwhtf8LnSdpYT6XN4DQ+WGAznR+gPLW7umeXAMky2JdayNd7HSD7JaCFF3injMyzi3hD9obZ5ja/VrVy1HCQijwL/HTCB/6OU+s+zHPdJ4NvAHUqpfXUr5RLg93nYe1s/uzf3MXRpkguXpzh7YYLRiQxl28UwDbrbogz0tNDVFqW/K1HT3peGGAyEOhkIdS5CLRonFPSxd3s/P9t/iq722DUDdrl8CQXcsa0fx3U5NTrBu+cucmYsyXg2h+06nEyN897wKBGPl1gwQDzkZypXZHSq0t3jsyzCPu+Nl7x2mYlsjrZUkP/yg1d5Ytut3NbTuSy+EBvhTPYsjnIxZeHBJGQG6PK3EbIC+E0/76YO6+A+BxGhN9hCb7Cl2UUBQKrl3xAREzgOPAwMAW8Bn1ZKHbnuuAjwPOAFvlgtuO/Zs0ft27ei4n9D2M446dy3KNvnscweoqGnsMyOZhdrRrbj8v2fHuWdw+eBSreW4yoCPg+/+vB2JkoFXjpyknS+iNcyCfm8+D0WIkLOLvPz0WEKto1TdhjPVHLDtIQCJMKBGRd8FB2bkuOws7ubtlCIbKlEMptnx5puntyxmVAd5s8vN9+/+CMu5C8S8dT3KkYpxUQpyW+u+4zummkyEXlbKVV1hLuWd2kvcEIpdWr6xF8HngSOXHfcHwJ/BPzbeZZVm4Wr8iTT/xNXpTElTtk5x0T6T2mL/jsMY/FXEVZjmQaP33cbd+4Y4MTZMQrFMu0tYRItIf5h/1FOjyVpDQfpSdy4OCZoebizo5dXz5/lYjqHZQpe0yJdKJItlmmPhgh4PYCi5DgUbQePabCnt5f4dCbJkNdLwOPh8MglRtMZ/sU9u4kucKB2OVFKcbFwCb9Z/37dK1dCk+XUgvvytcVRy/y+XuD8VT8PTd/3PhG5HehTSj0/14lE5PMisk9E9o1Np2jVZmfbw7juBJbRgYgXy2jHddOUnfPVn9xErfEQd+4Y4L69g/ijPv7spTe5mErTm4hOB+iZZQpFsGFNJELI46XsuiiBsnI4NzHJhdQU6WIJj2GytbOTD/f3vx/YrzBE6IpFGM/k+OrP3qFoV08lu1I4yqHgFLCkcS3rrN34PORafSz4UyAiBvDHwOeqHauUehZ4FirdMgt97ZXPQN2QXkixXPK9nRlL8n9/vI9IwEdojs03AIq2zaHRUYIeC49pEvH6KLtupZtGubhKUSzb9IYibO7uqNqn3hENMzI5xQ/eO8HHtt5az2otWe70Z6WR4w03fh61paqWKDEMXD3HbM30fVdEgK3Aj0XkDHAX8JyI6FUPC+Sx+vCYayk7wzjuFLYzgmX24LGWxoYFc5nKF/jrn75D2O+tGtgBjo1dxlUKz1XTIT2GQcTrJe7z0+IP0BkKc2EiTTJbW9bOzmiYn/7yDGcnJm+6HsuJJWYl22ADt4r0NPCqQKuvWoL7W8CgiKwTES/wNPDclQeVUimlVJtSakApNQC8ATyx0mbLNIOIh0TkCwR992KabQR899AS+W0MaW4Cq2qUUjy//z1KjkvYX72smWKJS9kMoSoLkAxD8HssjgxdouxU33bONAz8Xg8vHztVc9mXM0MMEt4ERbcxm3oARJuQTEy7OVW/hpVStoh8EfgelamQf6GUOiwiXwb2KaWem/sM2kIYRpBo6BPNLsa8XJnqONPA6UxG0lOVrI41dCd4LZN0oci5y5Ns6Kw+sJcIBjh2aYzxbI7W0OJuIdcMPYEujk4dw2/WdyC57Np4DA/hJqQD1m5OTddYSqkXgBeuu+9Lsxx7/8KLpS1nrx8/i9/rqSlYK6UYmUoTsGq/3A96vQyNpxhoS1TNrnklxe+J0XFa16384L4+vI53U9dPZFu4tJ1mW2zrql0/sBwtj5E5bdlIZvMcGxkjEaotm2LBtrFdd16J2UyjshXi5XRt2+j5PBanxydqPv9y1unrIO6JkavjrBZXubjKZVNkY93OqTWeDu5aXQ1NpABq3hQjV66ka5gv0zS4nKktgIW83lUzqCoi3NN6Jxk7W7c9OidKk2yObiJeh5QG2uLRQ99aXQ2Np+bVCq9sRj7/S32vaZLK1TZrxjINUvnSvF9juVoT7GVz9FaOpY/T6m1ZUFdKxs4StALc0XLt5LepZJYjb5/hvQNnKRXLtLRF2fmhQQY2dc8rtbDWODq4a3V1fiJVU9bMD9xc4DENg0yhOGuismuom32V5UlEuLttL5PlSUYLY7R4EzcV4LN2Dtu1eaL3n1yz6vXIO2f43jfeqOxNGwtgWQajIxP8v6++SltXjE88cx/RhB54bTbdLaPVVbFcxpxHIPHcxHaDMJ1iWCppaaspOQ6RGqZkriRew8ujXQ/TE+hhrDhO2S3X/FylFOPFCRSKj/c+fk26gTPHLvD8X79OrCVER0+cQMiH1+chmgjRuSZBaiLD3//vH1Mq1v56WmPo4K7VlcyzjRycntveyIU3uVKJ/pbV11/sM3081v0wH267m7Sd4XJxgpI7e/eUoxySpUkul8bZEFnPU33/lHZf2/uPK6V45fkDROIBvLNcnbV0RBkfneLEoaG610ebH90to9VVNOgnlS9Q66RDr1nZcMN23WtWp1bjKoUgNfXvl2yHda2Jms+9khhisDW+hXXhfn6ZPsnB1CGmyhmE6aseqfwer/wbjGxgc/RW2n1tN3TljI1MMjaSpKN37t9lKOrn7VeOsWV34
zeB1mang7tWV+vaExwfGSMerG0qpIiwJhrl5ERyXsHddlzCfu+sGz1f4bguIsJgR9ucx610ISvEzsR2tse3krbTJEspCk4BhcJjeIh7YsQ8UTzG7OMl6VQOwzSq9t8HQz7GL6XqXQVtnnRwX+Vsx+Xs0DjvnbrE+ZEkqXQeAWLRAGt7W7h1fSdre1qqD1pO645H5j162RWJcHIiiatUzVMoy45DZ6x6zvLxTI6da7pXXZ/7bAwxiHlixDyxeT/XNA1q6T1zXYVp6R7fZtPBfZVSSvHL06O8+JPDpLOVzTOCAS+JaKXFXSzbHDw6xNsHzxGPBXj8ga2s76ve+l3bFsdjmZRsB28NO1EBBDwe+mIxzqcmifiqB2GlwHEV7dG5g3vJdnCV4v5bdPdAPXT0JjBEcGx3zuA9lcyyaYfesanZ9NfrKmTbDs+/fIhv/OPbGCJ0t0dpTYQI+D2YpoFpGgT9XtoSYbo7ojiOy9985+d879WjOM7cC2O8lsXdG9cykZ3fCskNLS34LYtCDfnXy45DyOclFpj9i0ApxWg6w2O33UKH3lu1LoJhP5t3DzAxOjXrMY7tUirabL97cBFLps1EB/dVxnFcnvvRu+w/fJ6ujijBQPV0vOGgj672CG/uP82LPzmM6859bX7Hhj4EmddGGZZpsK2rC9t1Kc2R8VEpKJRt1ne0TM+HnOkYxcjkFLd1d3D3et2CrKcPP7aDaEuIsZFJXPfaL/pioczoSJI7H7qNrr6lsY/oaqaD+yrzzqFzHDo2Qnd7tOb+bQDDMOhuj/LOoXO8e2wYpUqoWZa3J0IBHtt5C2NTmXlNcYz5/dze00PZccmV7RmfmyuVaI+E6IjOvEim7DgMTU6xubuDp/dsn9dqWa26UMTP07/9EBu39TE2kuLS0ETl/3CSYr7EQ5+8gw8/ul0nGFsCdJ/7KjI5leOHr79He0to3n98ClCkCYeH+M733yYUSRIKKDxmL0H/r+D33YYhH6SZ3buhj6PDY5wem6ArFqn5dRKBAHvXrOHw6CWmikWCHg/WdIAulm1MMdjU035Dq10pxUQuT7Fs8+iWQT6ycWBes2+02oWiAT7+mQ+R/vhOhk9fxrYdQhE/fRs6deqBJUQH91XkF0eHQIHXM7+3XWFTKB3Cdi5iGCaOE+D02V523qZwVJpU9q9I54IkIs/g9WwAKukBnr5nB1/9yT4uTKbpjIZr/kIJ+7zcsWYNQ6kUp5NJcuVyJc2AGNyxfg2+6fLbrkuuVCJXLOMoxWBHK49uuYXeuN5QYjFE4iFu3aXTDCxVOrivEq6r2HfwHPFobfPPr1C4FIoHsN1xDIkiIkQjcOAw7NoqmBLFNKI4bprxqa/QGv037wf4oNfD5+7bwzd/9guOXbhMeyT0fmCuxhBhbTxOVzjMycsT5B2bvrYYmXKJbKqyytIyDPoScTasa2FbbxftYR1oNO0KHdxXiVQ6T6Fkzzu4l+wz2O7l9wM7gM8LY+OQy8OVtUqmUel6Sab/nPbEl97vogl6PfzGR3az79QQzx94DzenSAQDVYO847pMZHKUHJdHbhvk4W2DBLweSo5D2XEwRPBZ1rzGDTRtNdHBfZW4sjhpPhQu5fIZDLm2j14EDAOmMh8Ed6gE+LI9RKF4mKB/9/v3G4awd2Mfm3raeef0MK8dP/v+VEmfZeGxTIRKN0uhVEYhiMDO/h7u3NjHmpYPFtx4TROv7kvXtKp0cF8lqs1Pn/k5l1GUMGTm1r47wykNI0qu8MNrgvsVsaCfB27bwIc3DTCcnOLSZJozl5Ok80VcBX6vRX9rnO5ElN5ElJC/+jRNbX7yTo6SWyRgBvEaetXuSqaD+yrh9c7/rXZUmrlmy3pmSENiSISyM4RSDiIzt7A9lslAe4KB9gR3Dup56IthsjTBgcmfMZw/h4EBIqwPbWJHfC9+c35ddbXKlscYzu3jcvE4gtDu30xP8HaCVvWNzbWF08F9lWiJh1BKoZSqfRqkcmZM4euqyv/4DDMcZTrRusJG0N0nS0GqPMH3L34HhSLhqWR7dJTDicxRxooXebjzSXymv/qJ5mG8cJLDk99CEHxmDBSM5N7mQm4/21ueJubVX+qNpld4rBJBv4fWRIjcPLabE/GhuLHvJZuD7o6ZW+5K2SAmgu5SWSoOJH+Oi0vUE3//i90UkxZvG5OlcU5nj9X19Wy3yNHUd/AaEYJWO6Z4MQ0vIasDywhwZPLvcVXtq5e1m6OD+yohIty1az1TmULNz7HMVkDdsFI0l4ddt838HNu9TNB3h16huEQUnDzD+TNErJmzQIatKO+l363ra44XT+CoIh7jxu4erxGi5OZIls7U9TW1G+ngvops3tBJNBIgky3WdLwhYUyjBaU++EKYykBLHAb6bjxeKQWqTND3oXoVWVugslsCBENm/lP3GB6KTu1f+LXI2xNzdskppSjYOt97o+ngvor4fB6efHg7U5kC5fLsybmueY5nEEUZpWxKZSgU4KP3wvXT1JVS2M4wPu8OLHNNA0qv3Qy/GcAQA2eWbpCCkyfuqW+SL48RnLE77woBPEZ9+/i1G+ngvsr097by+AO3MTqRplDDJsamkcDv3U4un2ciWeSR+xSd7dce46oitnMer7WReOjTuktmCfEYXgbDW0iVkzc8ppQi52S5Nbqjrq/Z6h9ElMzYr+6oMoZYJHw6x36j1RTcReRRETkmIidE5PdmePx3ReSIiBwUkR+JSH/9i6rVy+5t/Tz1+G5y+RKj42nsOebA27bDRDKMKdv5xGMhNg4MU7aHsZ2LlO0RyvZ5lJshHHiMlujnMWboZ9Waa2tsN63eDi4XRyk4eRxlk7UzjJdG2RjeTF+wvoHWb0YZiNxH1r5Eyc0ClS+SkpMhZ4+yIfowHqPWXXa1myXVUrJKZbLyceBhYAh4C/i0UurIVcc8ALyplMqJyG8B9yul/vlc592zZ4/at2/fQsuvLcBUpsBr+05y4MgQrutiGIJnevekUtlBKYVpGty+dS0f2r2eUNBH2b5AsfwerjuFiBeP1YvPcysienbMUlZ2S5zJ/pJj6UPknSwxTwubo9vpDQzM2h+/EEopRguHOZt5lbw9CaIIWe30hz5Ce+DWur/eaiIibyul9lQ9robgfjfwB0qpj07//O8BlFL/aZbjdwF/qpSac1RNB/elI5cvMXxpkotjUyRTWUBojYfoao/S2xXH75t902RNm4tSLkU3gyB4jdozg2qzqzW417KIqRc4f9XPQ8Cdcxz/DPBiDefVlohgwMvgQAeDAx3NLoq2wogY+E2dgrkZ6rpCVUR+HdgD3DfL458HPg+wdq1eoaZpK51SioMXL2K7Lrt6enQWz0VUS3AfBq6e1bxm+r5riMhDwO8D9ymlZpxIrZR6FngWKt0y8y6tpmnLyrlUiq8dOIDrKsI+H5va2ppdpFWjluD+FjAoIuuoBPWngV+7+oDpfvY/Ax5VSo3WvZTaqpUspTg89UtOZc4BsD60lttigyS8M6+41JaWqM9HxOvDdl3ifj23fTFVHVAFEJHHgf8GmMBfKKX+o4h8
GdinlHpORH4IbAMuTD/lnFLqibnOqQdUtWpOZ87z4sWfoFBEzMouSxknh0LxaNd9bAjXr2vPth2+++IB+noS3Ll7fd3Oq0GuXMZVirBXz6iqh3oOqKKUegF44br7vnTV7YfmXUJNm0O6nOXFi68QMgP4zQ/yjvtML0WnxPcuvsJn+n+ViCdcl9ezHZez58cxTb2ur96CM2WYW+ZcVcJ20yhVRsSDZUQwlth0YJ3yV1uSjqVP4SrnmsB+hc/0knayHEufYk/L9rq8nt/n4YvPPIBp6TTF2uxKzjipwj6S+ddRVFZ4KxSCRcJ/FzH/HfispTHrTAd3bUkayl8gOEeO8YDh51zuQt2CO1Ry72jaTJSyuZR9nsn864CB12y9pqXuqjLJwmtM5F8h6rudrsgnmt6S18FdW5IMMXCZfTxI4TZkZaWmXU8pm+H035IuHsJv9iAzfO4M8eAzu1BKMVV8B9tNsSb2uaYGeP3XoS1Jg+EBCs7sqYnzTpHB8MDiFUhbtUazL0wH9t4ZA/vVRASf2UOufJKL6b9fpBLOTLfcVwHbcRmZnGJ0KsNoOourFFG/j+5YhJ5ElJBvaQ0EQWXK48+sA0yVM0SvGzRNlzMEzUBdZ8to2kzKTpJk/nX8ZnfNqROuBPip4n5a7fvxWV0NLuXMdHBfwYplmzdPneenx8+QLZVRSmEZBiJSyQQpldzaO9f2cN+mAdqj9Zl5Ug8+08uTPQ/xjyM/YrQ4jkc8CFBSZcJWiCd6HpxxsFXT6mmqeABg1s3eZyMiiJhMFvbRGf5YI4pWlQ7uK9S58Um+8eZBJnI5WkNBooGZBycd1+Xg+QvsPzfCY9tu4Z6N/RjG0lgi3uqL82v9T3I2O8y53AigWBvspT/Ui8fQH12tsZRymMi/ise8uc1MPEYrk4U3aA8+jGEsfkNE/4WsQMcujPG1194h6PPSG597JadpGHREw5Rsh3848B5j6SxP7tqyZAK8x7DYGOlnY0RvEaAtLkflcFQejxG/qecb4kHhYLtTeI326k+oMz2gusKMTE7xV6/vJx4MEJultT4Tr2XSG4/yxslzvPzeqQaWUNOWB1eVERbWyBEEl+o7njWCDu4rSNlx+PZbh/BZFgHv/OdsG4bQHYvyoyMnGE7qDYy11a3S8l5YfkOFatp0SB3cV5Ajw5e4MDlFInTzW91ZpoHfsnjx4PE6lkzTlh9TglgSwnHzN/V8V5Uw8GAZzclnr4P7CqGU4ifHThOtQ+a9RCjAqbEJxqYydSiZpi1PIiaJwIcpuzduLl6LsjtB3H+3brlrCzNVKHIxlSHsX/gH6cp83rPjkws+l6YtZ1HfLqAyc2Y+lHJRyiHur5q8sWF0cF8hxqayGFC3PSp9lsXpsYm6nEvTliuPGaMlcB8FZ4Ra0qND5Sq66IwQ9+/Fay3+LJkrdHBfIbLFEm4dz+e1TMazN9fXqGkrSXvoEWK+3RSc4aoteKVcCs4wYe9mOsJzbmnRcHqeu6Zp2hxETLojT2EZMSbyPwHAY7RgGh+Mb7mqSMmZAFEkAnfTEfoYhjQ3y6gO7itEyOet62VYyXZoXcCsG01bSURMOsKPkQjcw1RxP8n8qxSc6YFWBab4aAs+RMy/66ZXtNabDu51UijZJDM5HNfFMk0S4QA+z+L9etujIVwq/X316Hcv2jbr2pfGh1TTlgqPGaM1eD8tgY/guFlcShh4MY0gIksrnC6t0iwz6XyRd89e4OfHzzOeziFUVqQh4LqK9liIvbesZevaTsKBxuaWiPp9dMXCZAolIgt8rSsDR/2tN7fsWtNWOhETy2zO/PVa6eB+E2zH5c1j5/jBgeM4riIe8tMdj1zTYlZKkSuWeWHfUb73zjEeuf0W9g72YRqNGcMWEe7btI6vv3lwwcE9mc2zvr1lSWWJ1DRtfnRwn6dMvsjfvXKAs6NJOmJhPLPsuSkihPxeQn4vJdvmH39+lKPnR3n6IzsJNmg7ty29nXTHoySz+ZtepWo7LgXb5rHtm+pcOk3TFpOeCjkPuWKZr730NsMTU/S0RGcN7NfzWha9LVHOjib565ffplCyG1I+j2nyqTu2UrRtcqX5JytyXcWF1BQPbtlIb2JpX3JqmjY3HdxrpJTihX1HuTiZoTMWnvegpYjQGQtz/vIkP9jfuLwtPfEon7lnF6lcnlS+UPPzSrbD8OQUd21YywO3rm9Y+TRNWxw6uNfo+PAY+0+N0Bm/+X5oEaEzHuGN4+c4falxqz83dbfzrx+4E59pMjyZolCevRXvuC6jUxnGszk+vvPWJZXLXdO0m6f73GuglOKHB08QDfgwFjjN0DQMwj4vLx08wTMP761TCW+0tjXO7zx8z/vb7CWzeRTMuM3erv4e7r1laW2zp2nawujgXoMLyTQXJ9J0JyJ1OV8s5OfMpSRjqQztscYFVJ/H4t5N67hnYz8XUlNcSmUYu7JBdsBHVyxCT3xpbpC9VEwUs0yVCrT4QkS9C8+4qWmLRQf3GgyNTVZauXVKylU5j2J4fKqhwf0KyzToa4nT16Lnrc/HG6Nn+O65gwiCJQa/sfEOBmMdzS6WptWkpj53EXlURI6JyAkR+b0ZHveJyDemH39TRAbqXdBmOjOaxF/n1aZey+Ls2M3lidYaL1nM8d1zB2nzh+kORgl6vPzNqX04qp7p2TStcaoGdxExga8AjwFbgE+LyJbrDnsGSCqlNgJ/AvxRvQvaTMlsHq9V7+BukszorItLVbpcxEDwGpXpriHLS8GxydvN2Q9T0+arlpb7XuCEUuqUUqoEfB148rpjngT+cvr2t4EHpV59GJrWBC2+IB7DJF2uTCe9XMjS5gsRtPT4hLY81BLce4HzV/08NH3fjMcopWwgBbTWo4BLQSzop2zPbyeWakq2QzykB+iWqrDHx28O3olSiuHcJDGvn88O3rng2VKatlgWdUBVRD4PfB5g7dq1i/nSCzLQmeDI+dG6nrNkO6xtT9T1nFp9DURa+Q87PkrRsfGbVt0G1DVtMdTSch8G+q76ec30fTMeI5W8lzFg/PoTKaWeVUrtUUrtaW9v3vZT87WmLY5SquZttqqpnEfR2xqry/m0xjFECFgeHdi1ZaeW4P4WMCgi60TECzwNPHfdMc8Bn52+/SngJVWvSLgE9LZE6YxX0unWQypXYG17go5YqC7n0zRNu17V4D7dh/5F4HvAUeCbSqnDIvJlEbmySeCfA60icgL4XeCG6ZLLmYjw4I6NpLJ53AV+Z7muIpMv8sD2Dbo1qGlaw9TU566UegF44br7vnTV7QLwVH2LtrTcuqaD7et6OHLuEl0LWKl6aTLNnsE+NnStmPFmTdOWIJ04rEYiwsfu2ExbNMhoKjPv5yulGE1l6GqJ8NHbN+lWu6ZpDaWD+zyE/F4+++AeOuJhhsdTlJ3apkeWbYeR6Rzwn/2VPQ3brEPTNO0KnVtmnqJBP//ykb28duQMLx08CUAi5MfnsW7YZq9QtpnM5BFDeHT3Ju7a1I9l6u9TTdMaTwf3m+AxTe7ftoGd63s4ePoCbx4/T3IyXQnuivc3yE6EAzx
y+y1sH+gmGtQLljRNWzw6uC9APBTg3q3ruXfrerKFEslMDttVWKZBSziou180TWsaHdzr5Mpm2JqmaUuB7gDWNE1bgXRw1zRNW4F0cNc0TVuBdHDXNE1bgXRw1zRNW4F0cNc0TVuBpFmZeUVkDDi7wNO0AZfrUJzlYjXVdzXVFVZXfVdTXaH+9e1XSlXdEKNpwb0eRGSfUmpPs8uxWFZTfVdTXWF11Xc11RWaV1/dLaNpmrYC6eCuaZq2Ai334P5sswuwyFZTfVdTXWF11Xc11RWaVN9l3eeuaZqmzWy5t9w1TdO0GSyL4C4ij4rIMRE5ISI3bL4tIj4R+cb042+KyMDil7I+aqjr74rIERE5KCI/EpH+ZpSzXqrV96rjPikiSkSW7SyLWuoqIv9s+v09LCJ/u9hlrKcaPstrReRlEdk//Xl+vBnlrAcR+QsRGRWRQ7M8LiLyP6Z/FwdF5PaGF0optaT/AyZwElgPeIFfAFuuO+a3gf81fftp4BvNLncD6/oAEJy+/VvLta611nf6uAjwCvAGsKfZ5W7gezsI7AcS0z93NLvcDa7vs8BvTd/eApxpdrkXUN97gduBQ7M8/jjwIiDAXcCbjS7Tcmi57wVOKKVOKaVKwNeBJ6875kngL6dvfxt4UJbnDtRV66qUelkplZv+8Q1gzSKXsZ5qeW8B/hD4I6CwmIWrs1rq+q+AryilkgBKqdFFLmM91VJfBUSnb8eAkUUsX10ppV4BJuY45Enga6riDSAuIt2NLNNyCO69wPmrfh6avm/GY5RSNpACWheldPVVS12v9gyV1sByVbW+05evfUqp5xezYA1Qy3t7C3CLiLwmIm+IyKOLVrr6q6W+fwD8uogMAS8Av7M4RWuK+f5tL5jeiWmZEpFfB/YA9zW7LI0iIgbwx8DnmlyUxWJR6Zq5n8oV2Ssisk0pNdnUUjXOp4GvKqX+q4jcDfyViGxVSrnNLthKsBxa7sNA31U/r5m+b8ZjRMSicok3viilq69a6oqIRT9VcQAAAX9JREFUPAT8PvCEUqq4SGVrhGr1jQBbgR+LyBkqfZXPLdNB1Vre2yHgOaVUWSl1GjhOJdgvR7XU9xngmwBKqZ8Bfip5WFaimv6262k5BPe3gEERWSciXioDps9dd8xzwGenb38KeElNj2IsM1XrKiK7gD+jEtiXc58sVKmvUiqllGpTSg0opQaojDE8oZTa15ziLkgtn+PvUmm1IyJtVLppTi1mIeuolvqeAx4EEJHNVIL72KKWcvE8B/zG9KyZu4CUUupCQ1+x2aPMNY5EP06lFXMS+P3p+75M5Q8dKh+KbwEngJ8D65td5gbW9YfAJeDA9P/nml3mRtb3umN/zDKdLVPjeytUuqGOAO8CTze7zA2u7xbgNSozaQ4AjzS7zAuo698BF4AylSuwZ4AvAF+46r39yvTv4t3F+BzrFaqapmkr0HLoltE0TdPmSQd3TdO0FUgHd03TtBVIB3dN07QVSAd3TdO0FUgHd03TtBVIB3dN07QVSAd3TdO0Fej/AyxQ3Xk54htLAAAAAElFTkSuQmCC\n",
+ "text/plain": [
+ ""
+ ]
+ },
+ "metadata": {
+ "needs_background": "light"
+ },
+ "output_type": "display_data"
+ },
+ {
+ "data": {
+ "text/plain": [
+ "[]"
+ ]
+ },
+ "execution_count": 3,
+ "metadata": {},
+ "output_type": "execute_result"
+ },
+ {
+ "data": {
+ "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYYAAAD8CAYAAABzTgP2AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAGANJREFUeJzt3X+Q3HV9x/HnK0mBnkwlITcR82MvSvxBtQXZopUZRyFAtB1CW8Tg0UaLc2Mran/YGnoz0kFvBttOoe1YxysiUTIEm+qQWizys/5RoFwkAwEKScNuSAxwEqC1V4Ph3v1jvxd3j93cj+/+/H5fj5md2+/n+/nevhdy+97v56ciAjMzsykLOh2AmZl1FycGMzOr4cRgZmY1nBjMzKyGE4OZmdVwYjAzsxpODGZmVsOJwczMajgxmJlZjUWdDmA+li5dGgMDA50Ow8ysp+zYseNHEdE/U72eTAwDAwOMjY11Ogwzs54iqTybem5KMjOzGk4MZmZWw4nBzMxqODGYmVmNpiQGSTdIek7SrgbnJelvJe2R9LCkd1Sd2yhpd/LY2Ix4zMxs/pp1x3AjsO4Y598PrEkeQ8CXASQtAa4C3gmcBVwlaXGTYjKzDNmyZQsDAwMsWLCAgYEBtmzZ0umQMqspiSEivg8cOkaV9cDXo+J+4CRJpwAXAHdExKGIeAG4g2MnGKviPxTLiy1btjA0NES5XCYiKJfLDA0N+d98i7Srj2E58HTV8f6krFF5bs32w95/KJYnw8PDTExM1JRNTEwwPDzcoYiyrWc6nyUNSRqTNDY+Pt7pcFpiLh/2/kOxrJjNl6F9+/bVvbZRuaXTrsRwAFhZdbwiKWtU/ioRMRoRxYgo9vfPOKO7J83lw95/KJYFs/0ytGrVqrrXNyq3dNqVGLYDv5OMTnoX8FJEHARuB86XtDjpdD4/KculuXzY+w/FsmC2X4ZGRkbo6+urKevr62NkZKTlMeZRs4ar3gzcB7xZ0n5Jl0v6uKSPJ1VuA/YCe4B/AH4fICIOAZ8HHkweVydluTSXD3v/oVgWzPbL0ODgIKOjoxQKBSRRKBQYHR1lcHCwHWHmT0T03OPMM8+MLLrpppuir68vgKOPvr6+uOmmmxrWLxQKISkKhULDembdqlAo1Px7n3oUCoVOh5ZJwFjM4jO2Zzqf82Cu34oGBwcplUpMTk5SKpX87cl6ju98u5MTQ5fxh73lSauaiDzHJx1V7i56S7FYDO/HYGb1TI10qu7U7uvrc58EIGlHRBRnquc7BjPLFM/xSc+JIQd8W2154jk+6TkxZJyXzrC88Ryf9JwY2qRT39p9W21545FO6TkxtEEnv7X7ttryxpPh0vOopDYYGBigXC6/qrxQKFAqlTL72mbWXTwqqYt08lu7b6vNbK6cGNqgk51hvq22TvBIuN7mpqQ28IQbyxP/e+9ebkrqIv7WbnnikXC9z3cMZtZUCxYsoN7niiQmJyc7EJFN8R2DmXWEJ5j1PicGM2sqj4Trfc3awW2dpCck7ZG0qc75ayXtTB5PSnqx6twrVee2NyMeM+sc96n1vtR9DJIWAk8C5wH7qWzReWlEPNag/ieBMyLid5PjH0fEiXN5TfcxmJnNXTv7GM4C9kTE3oh4GdgKrD9G/UuBm5vwumZm1gLNSAzLgaerjvcnZa8iqQCsBu6uKj5B0pik+yVd1IR4zMwshXZ3Pm8AtkXEK1VlheTW5sPAdZLeWO9CSUNJAhkbHx9vR6xmlnGeoV1fMxLDAWBl1fGKpKyeDUxrRoqIA8nPvcC9wBn1LoyI0YgoRkSxv78/bcxmlnPeq6SxZiSGB4E1klZLOo7Kh/+rRhdJeguwGLivqmyxpOOT50uBs4G6ndZmZs3kGdqNLUr7CyLiiKQrgNuBhcANEfGopKuBsYiYShIbgK1ROwzqrcBXJE1SSVLXNBrNZGbWTN6rpLHUiQEgIm4DbptW9rlpx39e57p/B97ejBjMzOZi1apVdfcq8Qxtz3y2adwZZ3nhGdqNOTHYUe6MszzxDO3GvLqqHeVtQM2yzaurtkHWml3cGWdm4MQwb1lsdvFyyWYGTgzzlsUx0O6MMzNwYpi3LDa7uDPOzMCdz/Pmjloz6zXufG4xN7uYWVY5McyTm10sj7I2Es/qc1OSmc3K1Ei86kEXfX19/kLUQ9yUZGZNlcWReFafE4OZzUoWR+JZfU4MZjYrngCZH04MZjYrHomXH04MZjYrHomXH01JDJLWSXpC0h5Jm+qc/4ikcUk7k8fHqs5tlLQ7eWxsRjxm1hqDg4OUSiUmJycplUpOChmVegc3SQuBLwHnAfuBByVtr7NF5y0RccW0a5cAVwFFIIAdybUvpI3LzMzmpxl3DGcBeyJib0S8DGwF1s/y2guAOyLiUJIM7gDWNSEmMzObp2YkhuXA01XH+5Oy6X5L0sOStklaOcdrzcysTdrV+fzPwEBE/BKVu4LNc/0FkoYkjUkaGx8fb3qAZmZW0YzEcABYWXW8Iik7KiKej4jDyeH1wJmzvbbqd4xGRDEiiv39/U0I28zM6mlGYngQWCNptaTjgA3A9uoKkk6pOrwQeDx5fjtwvqTFkhYD5ydlZmZdJU8LCKYelRQRRyRdQeUDfSFwQ0Q8KulqYCwitgOfknQhcAQ4BHwkufaQpM9TSS4AV0fEobQxmZk10/QFBKe28gUyOWTXq6uamc0gKxtzeXVVM7MmydsCgk4MZmYzyNsCgk4MZmYzyNsCgk4MZmYzyNsCgk4MNi95GrpnBvlaQDD1cFXLn7wN3TPLG98x2Jx571+zbHNisDnL29A9s7xxYqjD7efHlrehe2Z548QwzVT7eblcJiKOtp87OfxM3obumeWNE8M0bj+fWd6G7pnljddKmmbBggXU+28iicnJyZa8pplZO3itpHly+7mZ5Z0TwzRuPzezvHNimMbt52aWd04MdeRp6ruZh2fbdE1JDJLWSXpC0h5Jm+qc/yNJj0l6WNJdkgpV516RtDN5bJ9+rZm1jodnWz2pRyVJWgg8CZwH7KeyTeelEfFYVZ33AQ9ExISk3wPeGxEfSs79OCJOnMtregc3s+bIys5kNjvtHJV0FrAnIvZGxMvAVmB9dYWIuCcipiYH3A+saMLrmllKXt7E6mlGYlgOPF11vD8pa+Ry4LtVxydIGpN0v6SLGl0kaSipNzY+Pp4uYjMDPDzb6mtr57Oky4Ai8JdVxYXk1ubDwHWS3ljv2ogYjYhiRBT7+/vbEK1Z9nl4ttXTjMRwAFhZdbwiKashaS0wDFwYEYenyiPiQPJzL3AvcEYTYjKzWfDwbKunGZ3Pi6h0Pp9LJSE8CHw4Ih6tqnMGsA1YFxG7q8oXAxMRcVjSUuA+YH11x3U97nw2M5u72XY+p97BLSKOSLoCuB1YCNwQEY9KuhoYi4jtVJqOTgT+URLAvoi4EHgr8BVJk1TuXq6ZKSmYmVlreRE9M7Oc8CJ6ZmY2L04MZmZWw4nBzMxqODGYmVkNJwYzM6vhxGBm1kRZWMY
89TwGMzOrmFrGfGKismbo1DLmQE/NJvcdg5lZkwwPDx9NClMmJiYYHh7uUETz48RgZtYkWVnG3InBzKxJsrKMuRODmVmTZGUZcycGM7Mmycoy5k4M1nJZGL5nNluDg4OUSiUmJycplUo9lxTAw1WtxbIyfM8sT3zHYC2VleF7ZnnixGAtlZXhe2Z50pTEIGmdpCck7ZG0qc754yXdkpx/QNJA1bkrk/InJF3QjHjqcTt3Z2Rl+J5ZnqRODJIWAl8C3g+cBlwq6bRp1S4HXoiIU4FrgS8m154GbAB+EVgH/H3y+5pqqp27XC4TEUfbuZ0cWi8rw/fM8qQZdwxnAXsiYm9EvAxsBdZPq7Me2Jw83wacq8rmz+uBrRFxOCKeAvYkv6+p3M7dOVkZvmeWJ80YlbQceLrqeD/wzkZ1IuKIpJeAk5Py+6ddu7zei0gaAoZg7s0QbufurMHBQScCsx7SM53PETEaEcWIKPb398/pWrdzm5nNXjMSwwFgZdXxiqSsbh1Ji4DXAs/P8trU3M5tZjZ7zUgMDwJrJK2WdByVzuTt0+psBzYmzy8G7o6ISMo3JKOWVgNrgP9oQkw13M5tZjZ7qfsYkj6DK4DbgYXADRHxqKSrgbGI2A58FfiGpD3AISrJg6TeN4HHgCPAJyLilbQx1eN2bjOz2WlKH0NE3BYRb4qIN0bESFL2uSQpEBE/iYgPRsSpEXFWROytunYkue7NEfHdZsRjZp67Y/PntZLMMshrVFkaPTMqycxmz3N3LA0nBrMM8twdS8OJwSyDPHfH0nBiMMsgz92xNJwYzDLIc3csDVXmmfWWYrEYY2NjnQ7DzKynSNoREcWZ6vmOwczMajgxmJlZDScGMzOr4cRgZmY1nBjMzKyGE4OZmdVwYjAzsxpODGZmHdKtS6OnSgySlki6Q9Lu5OfiOnVOl3SfpEclPSzpQ1XnbpT0lKSdyeP0NPGYmfWKqaXRy+UyEXF0afRuSA6pZj5L+gvgUERcI2kTsDgiPjutzpuAiIjdkl4P7ADeGhEvSroR+E5EbJvL63rms5n1uoGBAcrl8qvKC4UCpVKpJa/ZrpnP64HNyfPNwEXTK0TEkxGxO3n+Q+A5oD/l65qZ9bRuXho9bWJYFhEHk+fPAMuOVVnSWcBxwH9VFY8kTUzXSjo+ZTxmZj2hm5dGnzExSLpT0q46j/XV9aLSJtWwXUrSKcA3gI9GxGRSfCXwFuBXgCXAZxtcjqQhSWOSxsbHx2d+Z2ZmXaybl0afcc/niFjb6JykZyWdEhEHkw/+5xrU+wXgX4DhiLi/6ndP3W0clvQ14DPHiGMUGIVKH8NMcZuZdbOpJdCHh4fZt28fq1atYmRkpCuWRk/blLQd2Jg83wjcOr2CpOOAbwNfn97JnCQTJIlK/8SulPFYD+vWoXtmrTI4OEipVGJycpJSqdQVSQHSJ4ZrgPMk7QbWJsdIKkq6PqlzCfAe4CN1hqVukfQI8AiwFPhCynisR3Xz0D2zvPFGPdYVOjF0zyxvvFGP9ZRuHrpnljdODNYVunnonlneODFYV+jmoXtmeePEYF1hcHCQ0dFRCoUCkigUCoyOjnbNKA2zPHHns5lZTrjz2czM5sWJwczMajgxmJlZDScGsx7iZUOsHWZcRM/MusPUsiETExMAR5cNATx6y5rKdwxmPWJ4ePhoUpgyMTHB8PBwhyKyrHJiMOsRXjbE2sWJwaxHeNkQaxcnBrMe4WVDrF2cGMx6hJcNsXbxkhhmZjnRliUxJC2RdIek3cnPxQ3qvVK1e9v2qvLVkh6QtEfSLck2oGZm1kFpm5I2AXdFxBrgruS4nv+LiNOTx4VV5V8Ero2IU4EXgMtTxmNmZimlTQzrgc3J883ARbO9UJKAc4Bt87nezMxaI21iWBYRB5PnzwDLGtQ7QdKYpPslTX34nwy8GBFHkuP9wPKU8ZiZWUozLokh6U7gdXVO1Uy3jIiQ1KgnuxARByS9Abhb0iPAS3MJVNIQMAQet21m1koz3jFExNqIeFudx63As5JOAUh+PtfgdxxIfu4F7gXOAJ4HTpI0lZxWAAeOEcdoRBQjotjf3z+Ht2hm1tvavXhi2qak7cDG5PlG4NbpFSQtlnR88nwpcDbwWFTGyd4DXHys683M8mxq8cRyuUxEHF08sZXJIdU8BkknA98EVgFl4JKIOCSpCHw8Ij4m6d3AV4BJKonouoj4anL9G4CtwBLgIeCyiDg80+t6HoOZ5cXAwADlcvlV5YVCgVKpNKffNdt5DJ7gZmbWxRYsWEC9z2lJTE5Ozul3ec9nM7MM6MTiiU4MZmZdrBOLJzoxWM/x9paWJ51YPNF9DNZTpm9vCZVvT15l1Gxm7mOwTPL2lmat58RgPcXbW5q1nhOD9RRvb2nWek4M1lO8vaVZ6zkxWE/x9pZmredRSWZmOeFRSWZmNi9ODGZdwJP2rJvMuFGPmbXW9El7U8sqA+47sY7wHYNZh3nSnnUbJwazDvOkPes2TgxmHeZJe9ZtUiUGSUsk3SFpd/JzcZ0675O0s+rxE0kXJedulPRU1bnT08Rj1os8ac+6Tdo7hk3AXRGxBrgrOa4REfdExOkRcTpwDjABfK+qyp9MnY+InSnjMes5nrRn3SbtqKT1wHuT55uBe4HPHqP+xcB3I2LiGHXMcmdwcNCJwLpG2juGZRFxMHn+DLBshvobgJunlY1IeljStZKOTxmPmZmlNOMdg6Q7gdfVOVUzli4iQlLD9TUknQK8Hbi9qvhKKgnlOGCUyt3G1Q2uHwKGwJ1yZmatNGNiiIi1jc5JelbSKRFxMPngf+4Yv+oS4NsR8dOq3z11t3FY0teAzxwjjlEqyYNisdh7CzyZmfWItE1J24GNyfONwK3HqHsp05qRkmSCJAEXAbtSxmNmZimlTQzXAOdJ2g2sTY6RVJR0/VQlSQPASuDfpl2/RdIjwCPAUuALKeMxM7OUUo1KiojngXPrlI8BH6s6LgHL69Q7J83rm5lZ83nms5mZ1XBisEzzctZmc+dlty2zvJy12fz4jsEyy8tZm82PE4NllpezNpsfJwbLLC9nbTY/TgyWWZ1eztod39arnBgsszq5nPVUx3e5XCYijnZ8OzlYL1BE7y07VCwWY2xsrNNhmDU0MDBAuVx+VXmhUKBUKrU/IDNA0o6IKM5Uz3cMZi3gjm/rZU4MZi3gjm/rZU4MZi3Q6Y5vszScGMxawPs4Wy9z57OZWU6489lsDjznwOxnvIie5Z4X2zOrleqOQdIHJT0qaVJSw9sTSeskPSFpj6RNVeWrJT2QlN8i6bg08ZjNx1wX2/PdhWVd2qakXcBvAt9vVEHSQuBLwPuB04BLJZ2WnP4icG1EnAq8AFyeMh6zOZvLnAPPaLY8SJUYIuLxiHhihmpnAXsiYm9EvAxsBdZLEnAOsC2ptxm4KE08ZvMxlzkHXsrb8qAdnc/LgaerjvcnZScDL0bEkWnlZm01lzkHntFseTBjYpB0p6RddR7r2xFgVRxDksYkjY2Pj7fzpS3j5jLnwDOaLQ9mHJUUEWtTvsYBYGXV8Y
qk7HngJEmLkruGqfJGcYwCo1CZx5AyJrMag4ODsxqBNDIyUjOCCTyj2bKnHU1JDwJrkhFIxwEbgO1RmVl3D3BxUm8jcGsb4jGbN89otjxINfNZ0m8Afwf0Ay8COyPiAkmvB66PiA8k9T4AXAcsBG6IiJGk/A1UOqOXAA8Bl0XE4Zle1zOfzczmbrYzn70khplZTnhJDDMzmxcnBjMzq+HEYGZmNZwYzMysRk92PksaB1690/rsLAV+1MRweoHfcz74PWdf2vdbiIj+mSr1ZGJIQ9LYbHrls8TvOR/8nrOvXe/XTUlmZlbDicHMzGrkMTGMdjqADvB7zge/5+xry/vNXR+DmZkdWx7vGMzM7BhylRga7T2dRZJWSrpH0mPJvtyf7nRM7SJpoaSHJH2n07G0g6STJG2T9J+SHpf0q52OqdUk/WHy73qXpJslndDpmJpN0g2SnpO0q6psiaQ7JO1Ofi5uxWvnJjHMsPd0Fh0B/jgiTgPeBXwi4++32qeBxzsdRBv9DfCvEfEW4JfJ+HuXtBz4FFCMiLdRWbV5Q2ejaokbgXXTyjYBd0XEGuCu5LjpcpMYaLD3dIdjapmIOBgRP0ie/w+VD4vMb50qaQXwa8D1nY6lHSS9FngP8FWAiHg5Il7sbFRtsQj4eUmLgD7ghx2Op+ki4vvAoWnF64HNyfPNwEWteO08JYZGe09nnqQB4Azggc5G0hbXAX8KTHY6kDZZDYwDX0uaz66X9JpOB9VKEXEA+CtgH3AQeCkivtfZqNpmWUQcTJ4/AyxrxYvkKTHkkqQTgX8C/iAi/rvT8bSSpF8HnouIHZ2OpY0WAe8AvhwRZwD/S4uaF7pF0q6+nkpSfD3wGkmXdTaq9kt2wWzJsNI8JYZGe09nlqSfo5IUtkTEtzodTxucDVwoqUSlqfAcSTd1NqSW2w/sj4ipu8FtVBJFlq0FnoqI8Yj4KfAt4N0djqldnpV0CkDy87lWvEieEkPdvac7HFPLSBKVdufHI+KvOx1PO0TElRGxIiIGqPz/vTsiMv1NMiKeAZ6W9Oak6FzgsQ6G1A77gHdJ6kv+nZ9Lxjvcq2wHNibPNwK3tuJFFrXil3ajiDgi6Qrgdn629/SjHQ6rlc4Gfht4RNLOpOzPIuK2DsZkrfFJYEvyhWcv8NEOx9NSEfGApG3AD6iMvnuIDM6AlnQz8F5gqaT9wFXANcA3JV1OZYXpS1ry2p75bGZm1fLUlGRmZrPgxGBmZjWcGMzMrIYTg5mZ1XBiMDOzGk4MZmZWw4nBzMxqODGYmVmN/weyH1vd8khynAAAAABJRU5ErkJggg==\n",
+ "text/plain": [
+ ""
+ ]
+ },
+ "metadata": {
+ "needs_background": "light"
+ },
+ "output_type": "display_data"
+ }
+ ],
+ "source": [
+ "N = 50\n",
+ "x = np.random.rand(N)\n",
+ "y = np.random.rand(N)\n",
+ "colors = np.random.rand(N)\n",
+ "area = (30 * np.random.rand(N))**2 # 0 to 15 point radii\n",
+ "plt.scatter(x, y, s=area, c=colors, alpha=0.5)\n",
+ "plt.show()\n",
+ "\n",
+ "x = np.linspace(0, 10, 30)\n",
+ "y = np.sin(x)\n",
+ "plt.plot(x, y, 'o', color='black')"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 4,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ ""
+ ]
+ },
+ "execution_count": 4,
+ "metadata": {},
+ "output_type": "execute_result"
+ },
+ {
+ "data": {
+ "image/png": "iVBORw0KGgoAAAANSUhEUgAAAPgAAAD8CAYAAABaQGkdAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAACkBJREFUeJzt3d+rZQUZh/Hn23HKpiypLMwZGi9CkCCNYSCMIKWyEuuiC4WCIpirQikI665/IOoigpisIEvKEiIskzIqKHNmnCxnVGwwnOnHWBGakZP2dnH2wGQTZ53Za+29z8vzgYPnx2afdzM8rnX22We9qSok9fS8ZQ8gaToGLjVm4FJjBi41ZuBSYwYuNWbgUmMGLjVm4FJj50xxp6942Vrt2rltirv+Hw/fv30h30daJf/kKU7W09nodpMEvmvnNn55584p7vp/vP3Vly3k+0ir5J764aDbeYouNWbgUmMGLjVm4FJjBi41ZuBSYwYuNWbgUmODAk9ydZKHkjyS5Kaph5I0jg0DT7IGfA54B3ApcH2SS6ceTNL8hhzB9wCPVNXRqjoJ3Aq8e9qxJI1hSOAXAY+d9vGx2eckrbjRnmRLsjfJ/iT7H//Ls2PdraQ5DAn8OHD6n4btmH3uv1TVF6pqd1XtvuDla2PNJ2kOQwK/F3htkouTPB+4DvjOtGNJGsOGfw9eVc8k+TBwJ7AG3FxVD0w+maS5DbrgQ1XdAdwx8SySRuYr2aTGDFxqzMClxgxcaszApcYMXGrMwKXGDFxqbJLNJg/fv31hG0fu/P2hhXwfcIuKth6P4FJjBi41ZuBSYwYuNWbgUmMGLjVm4FJjBi41ZuBSY0M2m9yc5ESS3yxiIEnjGXIE/zJw9cRzSJrAhoFX1U+Avy5gFkkj82dwqbHR/posyV5gL8C5bB/rbiXNYbQj+Omri7bxgrHuVtIcPEWXGhvya7KvAz8HLklyLMmHph9L0hiG7Ca7fhGDSBqfp+hSYwYuNWbgUmMGLjVm4FJjBi41ZuBSYwYuNTbJ6qJFWuQ6oUWuSQJXJWl+HsGlxgxcaszApcYMXGrMwKXGDFxqzMClxgxcaszApcYMXGpsyEUXdya5O8nhJA8kuWERg0ma35DXoj8DfKyqDiY5DziQ5K6qOjzxbJLmNGQ32R+q6uDs/SeBI8BFUw8maX6b+muyJLuAy4F7zvA1VxdJK2bwk2xJXgx8C7ixqp547tddXSStnkGBJ9nGety3VNW3px1J0liGPIse4IvAkar69PQjSRrLkCP4FcD7gSuTHJq9vXPiuSSNYMhusp8BWcAskkbmK9mkxgxcaszApcYMXGrMwKXGDFxqzMClxgxcamzL7yZbpEXvClvkLjT3oPXkEVxqzMClxgxcaszApcYMXGrMwKXGDFxqzMClxgxcamzIRRfPTfLLJL+arS761CIGkzS/IS9VfRq4sqr+Prt88s+SfK+qfjHxbJLmNOSiiwX8ffbhttlbTTmUpHEMXXywluQQcAK4q6rOuLooyf4k+//F02PPKeksDAq8qp6tqsuAHcCeJK87w21cXSStmE09i15VfwPuBq6eZhxJYxryLPoFSc6fvf9C4K3Ag1MPJml+Q55FvxD4SpI11v+H8I2q+u60Y0kaw5Bn0e9nfSe4pC3GV7JJjRm41JiBS40ZuNSYgUuNGbjUmIFLjRm41Jiri1bYItcJuSapJ4/gUmMGLjVm4FJjBi41ZuBSYwYuNWbgUmMGLjVm4FJjgwOfXRv9viRej03aIjZzBL8BODLVIJLGN3SzyQ7gXcC+aceRNKahR/DPAB8H/j3hLJJGNmTxwTXAiao6sMHt3E0mrZghR/ArgGuTPArcClyZ5KvPvZG7yaTVs2HgVfWJqtpRVbuA64AfVdX7Jp9M0tz8PbjU2Kau6FJVPwZ+PMkkkkbnEVxqzMClxgxcaszApcYMXGrMwKXGDFxqzMClxlxdJMA1SV15BJcaM3CpMQOXGjNwqTEDlxozcKkxA5caM3CpMQOXGhv0SrbZFVWfBJ4Fnqmq3VMOJWkcm3mp6luq6s+TTSJpdJ6iS40NDbyAHyQ5kGTvlANJGs/QU/Q3VdXxJK8E7kryYFX95PQbzMLfC3Au20ceU9LZGHQEr6rjs/+eAG4H9pzhNq4uklbMkOWDL0py3qn3gbcBv5l6MEnzG3KK/irg9iSnbv+1qvr+pFNJGsWGgVfVUeD1C5hF0sj8NZnUmIFLjRm41JiBS40ZuNSYgUuNGbjUmIFLjbm6SAvXdU0SrN6qJI/gUmMGLjVm4FJjBi41ZuBSYwYuNWbgUmMGLjVm4FJjgwJPcn6S25I8mORIkjdOPZik+Q19qepnge9X1XuTPB+88Lm0FWwYeJKXAm8GPgBQVSeBk9OOJWkMQ07RLwYeB76U5L4k+2bXR5e04oYEfg7wBuDzVXU58BRw03NvlGRvkv1J9v+Lp0ceU9LZGBL4MeBYVd0z+/g21oP/L64uklbPhoFX1R+Bx5JcMvvUVcDhSaeSNIqhz6J/BLhl9gz6UeCD040kaSyDAq+qQ8DuiWeRNDJfySY1ZuBSYwYuNWbgUmMGLjVm4FJjBi41ZuBSYwYuNeZuMrW26F1hi9qFtuft/xh0O4/gUmMGLjVm4FJjBi41ZuBSYwYuNWbgUmMGLjVm4FJjGwae5JIkh057eyLJjYsYTtJ8NnypalU9BFwGkGQNOA7cPvFckkaw2VP0q4DfVtXvphhG0rg2+8cm1wFfP9MXkuwF9gKc6/JRaSUMPoLPlh5cC3zzTF93dZG0ejZziv4O4GBV/WmqYSSNazOBX8//OT2XtJoGBT7bB/5W4NvTjiNpTEN3kz0FvHziWSSNzFeySY0ZuNSYgUuNGbjUmIFLjRm41JiBS40ZuNRYqmr8O00eBzb7J6WvAP48+jCroetj83Etz2uq6oKNbjRJ4Gcjyf6q2r3sOabQ9bH5uFafp+hSYwYuNbZKgX9h2QNMqOtj83GtuJX5GVzS+FbpCC5pZCsReJKrkzyU5JEkNy17njEk2Znk7iSHkzyQ5IZlzzSmJGtJ7kvy3WXPMqYk5ye5LcmDSY4keeOyZ5rH0k/RZ9daf5j1K8YcA+4Frq+qw0sdbE5JLgQurKqDSc4DDgDv2eqP65QkHwV2Ay+pqmuWPc9YknwF+GlV7ZtdaHR7Vf1t2XOdrVU4gu8BHqmqo1V1ErgVePeSZ5pbVf2hqg7O3n8SOAJctNypxpFkB/AuYN+yZxlTkpcCbwa+CFBVJ7dy3LAagV8EPHbax8doEsIpSXYBlwP3LHeS0XwG+Djw72UPMrKLgceBL81+/Ng3ux7hlrUKgbeW5MXAt4Abq+qJZc8zryTXACeq6sCyZ5nAOcAbgM9X1eXAU8CWfk5oFQI/Duw87eMds89teUm2sR73LVXV5Yq0VwDXJnmU9R+nrkzy1eWONJpjwLGqOnWmdRvrwW9ZqxD4vcBrk1w8e1LjOuA7S55pbknC+s9yR6rq08ueZyxV9Ymq2lFVu1j/t/pRVb1
vyWONoqr+CDyW5JLZp64CtvSTopvdTTa6qnomyYeBO4E14OaqemDJY43hCuD9wK+THJp97pNVdccSZ9LGPgLcMjvYHAU+uOR55rL0X5NJms4qnKJLmoiBS40ZuNSYgUuNGbjUmIFLjRm41JiBS439B+u8ezPSTLfpAAAAAElFTkSuQmCC\n",
+ "text/plain": [
+ ""
+ ]
+ },
+ "metadata": {
+ "needs_background": "light"
+ },
+ "output_type": "display_data"
+ }
+ ],
+ "source": [
+ "m = np.eye(8, 8, dtype=np.uint8)\n",
+ "plt.imshow(m)"
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "PyCharm (trains-internal)",
+ "language": "python",
+ "name": "pycharm-40126efe"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.5.2"
+ },
+ "pycharm": {
+ "stem_cell": {
+ "cell_type": "raw",
+ "metadata": {
+ "collapsed": false
+ },
+ "source": []
+ }
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 1
+}
diff --git a/examples/keras_tensorboard.py b/examples/keras_tensorboard.py
new file mode 100644
index 00000000..32b89b63
--- /dev/null
+++ b/examples/keras_tensorboard.py
@@ -0,0 +1,113 @@
+# TRAINS - Keras with TensorBoard example code, automatically logging the model and TensorBoard outputs
+#
+# Train a simple deep NN on the MNIST dataset.
+# Gets to 98.40% test accuracy after 20 epochs
+# (there is *a lot* of margin for parameter tuning).
+# 2 seconds per epoch on a K520 GPU.
+from __future__ import print_function
+
+import numpy as np
+import tensorflow
+
+from keras.callbacks import TensorBoard, ModelCheckpoint
+from keras.datasets import mnist
+from keras.models import Sequential, Model
+from keras.layers.core import Dense, Dropout, Activation
+from keras.optimizers import SGD, Adam, RMSprop
+from keras.utils import np_utils
+from keras.models import load_model, save_model, model_from_json
+
+from trains import Task
+
+
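+# Custom TensorBoard callback: at the end of every epoch it encodes the first
+# validation digit as a PNG and writes it to the event file as an image summary.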
+class TensorBoardImage(TensorBoard):
+ @staticmethod
+ def make_image(tensor):
+ import tensorflow as tf
+ from PIL import Image
+ tensor = np.stack((tensor, tensor, tensor), axis=2)
+ height, width, channels = tensor.shape
+ image = Image.fromarray(tensor)
+ import io
+ output = io.BytesIO()
+ image.save(output, format='PNG')
+ image_string = output.getvalue()
+ output.close()
+ return tf.Summary.Image(height=height,
+ width=width,
+ colorspace=channels,
+ encoded_image_string=image_string)
+
+ def on_epoch_end(self, epoch, logs={}):
+ super().on_epoch_end(epoch, logs)
+ import tensorflow as tf
+ images = self.validation_data[0] # 0 - data; 1 - labels
+ img = (255 * images[0].reshape(28, 28)).astype('uint8')
+
+ image = self.make_image(img)
+ summary = tf.Summary(value=[tf.Summary.Value(tag='image', image=image)])
+ self.writer.add_summary(summary, epoch)
+
+
+batch_size = 128
+nb_classes = 10
+nb_epoch = 6
+
+# the data, shuffled and split between train and test sets
+(X_train, y_train), (X_test, y_test) = mnist.load_data()
+
+X_train = X_train.reshape(60000, 784)
+X_test = X_test.reshape(10000, 784)
+X_train = X_train.astype('float32')
+X_test = X_test.astype('float32')
+X_train /= 255.
+X_test /= 255.
+print(X_train.shape[0], 'train samples')
+print(X_test.shape[0], 'test samples')
+
+# convert class vectors to binary class matrices
+Y_train = np_utils.to_categorical(y_train, nb_classes)
+Y_test = np_utils.to_categorical(y_test, nb_classes)
+
+model = Sequential()
+model.add(Dense(512, input_shape=(784,)))
+model.add(Activation('relu'))
+# model.add(Dropout(0.2))
+model.add(Dense(512))
+model.add(Activation('relu'))
+# model.add(Dropout(0.2))
+model.add(Dense(10))
+model.add(Activation('softmax'))
+
+model2 = Sequential()
+model2.add(Dense(512, input_shape=(784,)))
+model2.add(Activation('relu'))
+
+model.summary()
+
+model.compile(loss='categorical_crossentropy',
+ optimizer=RMSprop(),
+ metrics=['accuracy'])
+
+# Connecting TRAINS
+task = Task.init(project_name='examples', task_name='Keras with TensorBoard example')
+# setting model outputs
+labels = dict(('digit_%d' % i, i) for i in range(10))
+task.set_model_label_enumeration(labels)
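+# any model stored from this point onwards (e.g. by the ModelCheckpoint
+# callback below) will carry this label enumeration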
+
+board = TensorBoardImage(histogram_freq=1, log_dir='/tmp/histogram_example', write_images=False)
+model_store = ModelCheckpoint(filepath='/tmp/histogram_example/weight.{epoch}.hdf5')
+
+# load previous model weights, if a checkpoint is there
+try:
+    model.load_weights('/tmp/histogram_example/weight.1.hdf5')
+except Exception:
+    pass
+
+history = model.fit(X_train, Y_train,
+ batch_size=batch_size, epochs=nb_epoch,
+ callbacks=[board, model_store],
+ verbose=1, validation_data=(X_test, Y_test))
+score = model.evaluate(X_test, Y_test, verbose=0)
+print('Test score:', score[0])
+print('Test accuracy:', score[1])
diff --git a/examples/manual_model_config.py b/examples/manual_model_config.py
new file mode 100644
index 00000000..7b3434a1
--- /dev/null
+++ b/examples/manual_model_config.py
@@ -0,0 +1,29 @@
+# TRAINS - Example of manual model configuration
+#
+import torch
+from trains import Task
+
+
+task = Task.init(project_name='examples', task_name='Manual model configuration')
+
+# create a model
+model = torch.nn.Module()
+
+# store dictionary of definition for a specific network design
+model_config_dict = {
+ 'value': 13.37,
+ 'dict': {'sub_value': 'string'},
+ 'list_of_ints': [1, 2, 3, 4],
+}
+task.set_model_config(config_dict=model_config_dict)
+
+# or read from a config file (this will override the previous configuration dictionary)
+# task.set_model_config(config_text='this is just a blob\nof text from a configuration file')
+
+# store the label enumeration the model is training for
+task.set_model_label_enumeration({'background': 0, 'cat': 1, 'dog': 2})
+print('Any model stored from this point onwards, will contain both model_config and label_enumeration')
+
+# storing the model, it will have the task network configuration and label enumeration
+torch.save(model, '/tmp/model')
+print('Model saved')
diff --git a/examples/manual_reporting.py b/examples/manual_reporting.py
new file mode 100644
index 00000000..148af51e
--- /dev/null
+++ b/examples/manual_reporting.py
@@ -0,0 +1,51 @@
+# TRAINS - Example of manual graphs and statistics reporting
+#
+import numpy as np
+import logging
+from trains import Task
+
+
+task = Task.init(project_name='examples', task_name='Manual reporting')
+
+# example python logger
+logging.getLogger().setLevel('DEBUG')
+logging.debug('This is a debug message')
+logging.info('This is an info message')
+logging.warning('This is a warning message')
+logging.error('This is an error message')
+logging.critical('This is a critical message')
+
+# get TRAINS logger object for any metrics / reports
+logger = task.get_logger()
+
+# log text
+logger.console("hello")
+
+# report scalar values
+logger.report_scalar("example_scalar", "series A", iteration=0, value=100)
+logger.report_scalar("example_scalar", "series A", iteration=1, value=200)
+
+# report histogram
+histogram = np.random.randint(10, size=10)
+logger.report_vector("example_histogram", "random histogram", iteration=1, values=histogram)
+
+# report confusion matrix
+confusion = np.random.randint(10, size=(10, 10))
+logger.report_matrix("example_confusion", "ignored", iteration=1, matrix=confusion)
+
+# report 2d scatter plot
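+# (builds a 10x2 array of (x, y) points: column 0 is x = 0..9, column 1 is a random y)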
+scatter2d = np.hstack((np.atleast_2d(np.arange(0, 10)).T, np.random.randint(10, size=(10, 1))))
+logger.report_scatter2d("example_scatter", "series_xy", iteration=1, scatter=scatter2d)
+
+# report 3d scatter plot
+scatter3d = np.random.randint(10, size=(10, 3))
+logger.report_scatter3d("example_scatter_3d", "series_xyz", iteration=1, scatter=scatter3d)
+
+# report image
+m = np.eye(256, 256, dtype=np.uint8)*255
+logger.report_image_and_upload("fail cases", "image uint", iteration=1, matrix=m)
+m = np.eye(256, 256, dtype=np.float)
+logger.report_image_and_upload("fail cases", "image float", iteration=1, matrix=m)
+
+# flush reports (otherwise it will be flushed in the background, every couple of seconds)
+logger.flush()
diff --git a/examples/matplotlib_example.py b/examples/matplotlib_example.py
new file mode 100644
index 00000000..f918bed7
--- /dev/null
+++ b/examples/matplotlib_example.py
@@ -0,0 +1,36 @@
+# TRAINS - Example of Matplotlib integration and reporting
+#
+import numpy as np
+import matplotlib.pyplot as plt
+from trains import Task
+
+
+task = Task.init(project_name='examples', task_name='Matplotlib example')
+
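+# every figure displayed below with plt.show() is captured and reported by TRAINS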
+# create plot
+N = 50
+x = np.random.rand(N)
+y = np.random.rand(N)
+colors = np.random.rand(N)
+area = (30 * np.random.rand(N))**2 # 0 to 15 point radii
+plt.scatter(x, y, s=area, c=colors, alpha=0.5)
+plt.show()
+
+# create another plot - with a name
+x = np.linspace(0, 10, 30)
+y = np.sin(x)
+plt.plot(x, y, 'o', color='black')
+plt.show()
+
+# create image plot
+m = np.eye(256, 256, dtype=np.uint8)
+plt.imshow(m)
+plt.show()
+
+# create image plot - with a name
+m = np.eye(256, 256, dtype=np.uint8)
+plt.imshow(m)
+plt.title('Image Title')
+plt.show()
+
+print('This is a Matplotlib example')
diff --git a/examples/pytorch_matplotlib.py b/examples/pytorch_matplotlib.py
new file mode 100644
index 00000000..691f73c6
--- /dev/null
+++ b/examples/pytorch_matplotlib.py
@@ -0,0 +1,479 @@
+# TRAINS - Example of Pytorch and matplotlib integration and reporting
+#
+"""
+Neural Transfer Using PyTorch
+=============================
+**Author**: Alexis Jacq
+
+**Edited by**: Winston Herring
+Introduction
+------------
+This tutorial explains how to implement the Neural-Style algorithm
+developed by Leon A. Gatys, Alexander S. Ecker and Matthias Bethge.
+Neural-Style, or Neural-Transfer, allows you to take an image and
+reproduce it with a new artistic style. The algorithm takes three images,
+an input image, a content-image, and a style-image, and changes the input
+to resemble the content of the content-image and the artistic style of the style-image.
+
+.. figure:: /_static/img/neural-style/neuralstyle.png
+ :alt: content1
+"""
+
+######################################################################
+# Underlying Principle
+# --------------------
+#
+# The principle is simple: we define two distances, one for the content
+# (:math:`D_C`) and one for the style (:math:`D_S`). :math:`D_C` measures how different the content
+# is between two images while :math:`D_S` measures how different the style is
+# between two images. Then, we take a third image, the input, and
+# transform it to minimize both its content-distance with the
+# content-image and its style-distance with the style-image. Now we can
+# import the necessary packages and begin the neural transfer.
+#
+# Importing Packages and Selecting a Device
+# -----------------------------------------
+# Below is a list of the packages needed to implement the neural transfer.
+#
+# - ``torch``, ``torch.nn``, ``numpy`` (indispensable packages for
+# neural networks with PyTorch)
+# - ``torch.optim`` (efficient gradient descents)
+# - ``PIL``, ``PIL.Image``, ``matplotlib.pyplot`` (load and display
+# images)
+# - ``torchvision.transforms`` (transform PIL images into tensors)
+# - ``torchvision.models`` (train or load pre-trained models)
+# - ``copy`` (to deep copy the models; system package)
+
+from __future__ import print_function
+import torch
+import torch.nn as nn
+import torch.nn.functional as F
+import torch.optim as optim
+
+from PIL import Image
+import matplotlib.pyplot as plt
+
+import torchvision.transforms as transforms
+import torchvision.models as models
+
+import copy
+from trains import Task
+
+
+task = Task.init(project_name='examples', task_name='pytorch with matplotlib example', task_type=Task.TaskTypes.testing)
+
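+# TRAINS captures the matplotlib figures displayed throughout this tutorial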
+
+######################################################################
+# Next, we need to choose which device to run the network on and import the
+# content and style images. Running the neural transfer algorithm on large
+# images takes longer and will go much faster when running on a GPU. We can
+# use ``torch.cuda.is_available()`` to detect if there is a GPU available.
+# Next, we set the ``torch.device`` for use throughout the tutorial. Also the ``.to(device)``
+# method is used to move tensors or modules to a desired device.
+
+device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
+
+######################################################################
+# Loading the Images
+# ------------------
+#
+# Now we will import the style and content images. The original PIL images have values between 0 and 255, but when
+# transformed into torch tensors, their values are converted to be between
+# 0 and 1. The images also need to be resized to have the same dimensions.
+# An important detail to note is that neural networks from the
+# torch library are trained with tensor values ranging from 0 to 1. If you
+# try to feed the networks with 0 to 255 tensor images, then the activated
+# feature maps will be unable to sense the intended content and style.
+# However, pre-trained networks from the Caffe library are trained with 0
+# to 255 tensor images.
+#
+#
+# .. Note::
+#    Two images are required to run the tutorial: ``picasso.jpg`` and
+#    ``dancing.jpg``. Download them and place them in a directory named
+#    ``samples`` in your current working directory, which is where the
+#    ``image_loader`` calls below expect to find them.
+
+# desired size of the output image
+imsize = 512 if torch.cuda.is_available() else 128 # use small size if no gpu
+
+loader = transforms.Compose([
+ transforms.Resize(imsize), # scale imported image
+ transforms.ToTensor()]) # transform it into a torch tensor
+
+
+def image_loader(image_name):
+ image = Image.open(image_name)
+ # fake batch dimension required to fit network's input dimensions
+ image = loader(image).unsqueeze(0)
+ return image.to(device, torch.float)
+
+
+style_img = image_loader("./samples/picasso.jpg")
+content_img = image_loader("./samples/dancing.jpg")
+
+assert style_img.size() == content_img.size(), \
+ "we need to import style and content images of the same size"
+
+######################################################################
+# Now, let's create a function that displays an image by reconverting a
+# copy of it to PIL format and displaying the copy using
+# ``plt.imshow``. We will try displaying the content and style images
+# to ensure they were imported correctly.
+
+unloader = transforms.ToPILImage() # reconvert into PIL image
+
+plt.ion()
+
+
+def imshow(tensor, title=None):
+ image = tensor.cpu().clone() # we clone the tensor to not do changes on it
+ image = image.squeeze(0) # remove the fake batch dimension
+ image = unloader(image)
+ plt.imshow(image)
+ if title is not None:
+ plt.title(title)
+ plt.pause(0.001) # pause a bit so that plots are updated
+
+
+plt.figure()
+imshow(style_img, title='Style Image')
+
+plt.figure()
+imshow(content_img, title='Content Image')
+
+
+######################################################################
+# Loss Functions
+# --------------
+# Content Loss
+# ~~~~~~~~~~~~
+#
+# The content loss is a function that represents a weighted version of the
+# content distance for an individual layer. The function takes the feature
+# maps :math:`F_{XL}` of a layer :math:`L` in a network processing input :math:`X` and returns the
+# weighted content distance :math:`w_{CL}.D_C^L(X,C)` between the image :math:`X` and the
+# content image :math:`C`. The feature maps of the content image (:math:`F_{CL}`) must be
+# known by the function in order to calculate the content distance. We
+# implement this function as a torch module with a constructor that takes
+# :math:`F_{CL}` as an input. The distance :math:`\|F_{XL} - F_{CL}\|^2` is the mean square error
+# between the two sets of feature maps, and can be computed using ``nn.MSELoss``.
+#
+# We will add this content loss module directly after the convolution
+# layer(s) that are being used to compute the content distance. This way
+# each time the network is fed an input image the content losses will be
+# computed at the desired layers and, because of autograd, all the
+# gradients will be computed. Now, in order to make the content loss layer
+# transparent we must define a ``forward`` method that computes the content
+# loss and then returns the layer’s input. The computed loss is saved as a
+# parameter of the module.
+#
+
+class ContentLoss(nn.Module):
+
+ def __init__(self, target, ):
+ super(ContentLoss, self).__init__()
+ # we 'detach' the target content from the tree used
+ # to dynamically compute the gradient: this is a stated value,
+ # not a variable. Otherwise the forward method of the criterion
+ # will throw an error.
+ self.target = target.detach()
+
+ def forward(self, input):
+ self.loss = F.mse_loss(input, self.target)
+ return input
+
+
+######################################################################
+# .. Note::
+# **Important detail**: although this module is named ``ContentLoss``, it
+# is not a true PyTorch Loss function. If you want to define your content
+# loss as a PyTorch Loss function, you have to create a PyTorch autograd function
+# to recompute/implement the gradient manually in the ``backward``
+# method.
+
+######################################################################
+# Style Loss
+# ~~~~~~~~~~
+#
+# The style loss module is implemented similarly to the content loss
+# module. It will act as a transparent layer in a
+# network that computes the style loss of that layer. In order to
+# calculate the style loss, we need to compute the gram matrix :math:`G_{XL}`. A gram
+# matrix is the result of multiplying a given matrix by its transposed
+# matrix. In this application the given matrix is a reshaped version of
+# the feature maps :math:`F_{XL}` of a layer :math:`L`. :math:`F_{XL}` is reshaped to form :math:`\hat{F}_{XL}`, a :math:`K`\ x\ :math:`N`
+# matrix, where :math:`K` is the number of feature maps at layer :math:`L` and :math:`N` is the
+# length of any vectorized feature map :math:`F_{XL}^k`. For example, the first line
+# of :math:`\hat{F}_{XL}` corresponds to the first vectorized feature map :math:`F_{XL}^1`.
+#
+# Finally, the gram matrix must be normalized by dividing each element by
+# the total number of elements in the matrix. This normalization is to
+# counteract the fact that :math:`\hat{F}_{XL}` matrices with a large :math:`N` dimension yield
+# larger values in the Gram matrix. These larger values will cause the
+# first layers (before pooling layers) to have a larger impact during the
+# gradient descent. Style features tend to be in the deeper layers of the
+# network so this normalization step is crucial.
+#
+
+def gram_matrix(input):
+ a, b, c, d = input.size() # a=batch size(=1)
+ # b=number of feature maps
+ # (c,d)=dimensions of a f. map (N=c*d)
+
+    features = input.view(a * b, c * d)  # resize F_XL into \hat F_XL
+
+ G = torch.mm(features, features.t()) # compute the gram product
+
+ # we 'normalize' the values of the gram matrix
+    # by dividing by the number of elements in each feature map.
+ return G.div(a * b * c * d)
+
+
+######################################################################
+# Now the style loss module looks almost exactly like the content loss
+# module. The style distance is also computed using the mean square
+# error between :math:`G_{XL}` and :math:`G_{SL}`.
+#
+
+class StyleLoss(nn.Module):
+
+ def __init__(self, target_feature):
+ super(StyleLoss, self).__init__()
+ self.target = gram_matrix(target_feature).detach()
+
+ def forward(self, input):
+ G = gram_matrix(input)
+ self.loss = F.mse_loss(G, self.target)
+ return input
+
+
+######################################################################
+# Importing the Model
+# -------------------
+#
+# Now we need to import a pre-trained neural network. We will use a 19
+# layer VGG network like the one used in the paper.
+#
+# PyTorch’s implementation of VGG is a module divided into two child
+# ``Sequential`` modules: ``features`` (containing convolution and pooling layers),
+# and ``classifier`` (containing fully connected layers). We will use the
+# ``features`` module because we need the output of the individual
+# convolution layers to measure content and style loss. Some layers have
+# different behavior during training than evaluation, so we must set the
+# network to evaluation mode using ``.eval()``.
+#
+
+cnn = models.vgg19(pretrained=True).features.to(device).eval()
+
+######################################################################
+# Additionally, VGG networks are trained on images with each channel
+# normalized by mean=[0.485, 0.456, 0.406] and std=[0.229, 0.224, 0.225].
+# We will use them to normalize the image before sending it into the network.
+#
+
+cnn_normalization_mean = torch.tensor([0.485, 0.456, 0.406]).to(device)
+cnn_normalization_std = torch.tensor([0.229, 0.224, 0.225]).to(device)
+
+
+# create a module to normalize input image so we can easily put it in a
+# nn.Sequential
+class Normalization(nn.Module):
+ def __init__(self, mean, std):
+ super(Normalization, self).__init__()
+ # .view the mean and std to make them [C x 1 x 1] so that they can
+ # directly work with image Tensor of shape [B x C x H x W].
+ # B is batch size. C is number of channels. H is height and W is width.
+ self.mean = torch.tensor(mean).view(-1, 1, 1)
+ self.std = torch.tensor(std).view(-1, 1, 1)
+
+ def forward(self, img):
+ # normalize img
+ return (img - self.mean) / self.std
+
+
+######################################################################
+# A ``Sequential`` module contains an ordered list of child modules. For
+# instance, ``vgg19.features`` contains a sequence (Conv2d, ReLU, MaxPool2d,
+# Conv2d, ReLU…) aligned in the right order of depth. We need to add our
+# content loss and style loss layers immediately after the convolution
+# layer they are detecting. To do this we must create a new ``Sequential``
+# module that has content loss and style loss modules correctly inserted.
+#
+
+# desired depth layers to compute style/content losses :
+content_layers_default = ['conv_4']
+style_layers_default = ['conv_1', 'conv_2', 'conv_3', 'conv_4', 'conv_5']
+
+
+def get_style_model_and_losses(cnn, normalization_mean, normalization_std,
+ style_img, content_img,
+ content_layers=content_layers_default,
+ style_layers=style_layers_default):
+ cnn = copy.deepcopy(cnn)
+
+ # normalization module
+ normalization = Normalization(normalization_mean, normalization_std).to(device)
+
+    # lists giving iterable access to the content/style losses
+ content_losses = []
+ style_losses = []
+
+ # assuming that cnn is a nn.Sequential, so we make a new nn.Sequential
+ # to put in modules that are supposed to be activated sequentially
+ model = nn.Sequential(normalization)
+
+ i = 0 # increment every time we see a conv
+ for layer in cnn.children():
+ if isinstance(layer, nn.Conv2d):
+ i += 1
+ name = 'conv_{}'.format(i)
+ elif isinstance(layer, nn.ReLU):
+ name = 'relu_{}'.format(i)
+ # The in-place version doesn't play very nicely with the ContentLoss
+ # and StyleLoss we insert below. So we replace with out-of-place
+ # ones here.
+ layer = nn.ReLU(inplace=False)
+ elif isinstance(layer, nn.MaxPool2d):
+ name = 'pool_{}'.format(i)
+ elif isinstance(layer, nn.BatchNorm2d):
+ name = 'bn_{}'.format(i)
+ else:
+ raise RuntimeError('Unrecognized layer: {}'.format(layer.__class__.__name__))
+
+ model.add_module(name, layer)
+
+ if name in content_layers:
+ # add content loss:
+ target = model(content_img).detach()
+ content_loss = ContentLoss(target)
+ model.add_module("content_loss_{}".format(i), content_loss)
+ content_losses.append(content_loss)
+
+ if name in style_layers:
+ # add style loss:
+ target_feature = model(style_img).detach()
+ style_loss = StyleLoss(target_feature)
+ model.add_module("style_loss_{}".format(i), style_loss)
+ style_losses.append(style_loss)
+
+ # now we trim off the layers after the last content and style losses
+ for i in range(len(model) - 1, -1, -1):
+ if isinstance(model[i], ContentLoss) or isinstance(model[i], StyleLoss):
+ break
+
+ model = model[:(i + 1)]
+
+ return model, style_losses, content_losses
+
+
+######################################################################
+# Next, we select the input image. You can use a copy of the content image
+# or white noise.
+#
+
+input_img = content_img.clone()
+# if you want to use white noise instead uncomment the below line:
+# input_img = torch.randn(content_img.data.size(), device=device)
+
+# add the original input image to the figure:
+plt.figure()
+imshow(input_img, title='Input Image')
+
+
+######################################################################
+# Gradient Descent
+# ----------------
+#
+# As Leon Gatys, the author of the algorithm, suggested, we will use the
+# L-BFGS algorithm to run our gradient descent. Unlike training a network,
+# we want to train the input image in order to minimise the content/style
+# losses. We will create a PyTorch L-BFGS optimizer ``optim.LBFGS`` and pass
+# our image to it as the tensor to optimize.
+#
+
+def get_input_optimizer(input_img):
+ # this line to show that input is a parameter that requires a gradient
+ optimizer = optim.LBFGS([input_img.requires_grad_()])
+ return optimizer
+
+
+######################################################################
+# Finally, we must define a function that performs the neural transfer. For
+# each iteration of the networks, it is fed an updated input and computes
+# new losses. We will run the ``backward`` methods of each loss module to
+# dynamically compute their gradients. The optimizer requires a “closure”
+# function, which reevaluates the module and returns the loss.
+#
+# We still have one final constraint to address. The network may try to
+# optimize the input with values that exceed the 0 to 1 tensor range for
+# the image. We can address this by correcting the input values to be
+# between 0 to 1 each time the network is run.
+#
+
+def run_style_transfer(cnn, normalization_mean, normalization_std,
+ content_img, style_img, input_img, num_steps=300,
+ style_weight=1000000, content_weight=1):
+ """Run the style transfer."""
+ print('Building the style transfer model..')
+ model, style_losses, content_losses = get_style_model_and_losses(cnn,
+ normalization_mean, normalization_std, style_img,
+ content_img)
+ optimizer = get_input_optimizer(input_img)
+
+ print('Optimizing..')
+ run = [0]
+ while run[0] <= num_steps:
+
+ def closure():
+ # correct the values of updated input image
+ input_img.data.clamp_(0, 1)
+
+ optimizer.zero_grad()
+ model(input_img)
+ style_score = 0
+ content_score = 0
+
+ for sl in style_losses:
+ style_score += sl.loss
+ for cl in content_losses:
+ content_score += cl.loss
+
+ style_score *= style_weight
+ content_score *= content_weight
+
+ loss = style_score + content_score
+ loss.backward()
+
+ run[0] += 1
+ if run[0] % 50 == 0:
+ print("run {}:".format(run))
+ print('Style Loss : {:4f} Content Loss: {:4f}'.format(
+ style_score.item(), content_score.item()))
+ print()
+
+ return style_score + content_score
+
+ optimizer.step(closure)
+
+ # a last correction...
+ input_img.data.clamp_(0, 1)
+
+ return input_img
+
+
+######################################################################
+# Finally, we can run the algorithm.
+#
+
+output = run_style_transfer(cnn, cnn_normalization_mean, cnn_normalization_std,
+ content_img, style_img, input_img)
+
+plt.figure()
+imshow(output, title='Output Image')
+
+# sphinx_gallery_thumbnail_number = 4
+plt.ioff()
+plt.show()
diff --git a/examples/pytorch_mnist.py b/examples/pytorch_mnist.py
new file mode 100644
index 00000000..81b537ee
--- /dev/null
+++ b/examples/pytorch_mnist.py
@@ -0,0 +1,124 @@
+# TRAINS - Example of Pytorch mnist training integration
+#
+from __future__ import print_function
+import argparse
+import torch
+import torch.nn as nn
+import torch.nn.functional as F
+import torch.optim as optim
+from torchvision import datasets, transforms
+
+from trains import Task
+task = Task.init(project_name='examples', task_name='pytorch mnist train')
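+# TRAINS automatically logs the argparse command-line parameters defined in main()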
+
+
+class Net(nn.Module):
+ def __init__(self):
+ super(Net, self).__init__()
+ self.conv1 = nn.Conv2d(1, 20, 5, 1)
+ self.conv2 = nn.Conv2d(20, 50, 5, 1)
+ self.fc1 = nn.Linear(4 * 4 * 50, 500)
+ self.fc2 = nn.Linear(500, 10)
+
+ def forward(self, x):
+ x = F.relu(self.conv1(x))
+ x = F.max_pool2d(x, 2, 2)
+ x = F.relu(self.conv2(x))
+ x = F.max_pool2d(x, 2, 2)
+ x = x.view(-1, 4 * 4 * 50)
+ x = F.relu(self.fc1(x))
+ x = self.fc2(x)
+ return F.log_softmax(x, dim=1)
+
+
+def train(args, model, device, train_loader, optimizer, epoch):
+ model.train()
+ for batch_idx, (data, target) in enumerate(train_loader):
+ data, target = data.to(device), target.to(device)
+ optimizer.zero_grad()
+ output = model(data)
+ loss = F.nll_loss(output, target)
+ loss.backward()
+ optimizer.step()
+ if batch_idx % args.log_interval == 0:
+ print('Train Epoch: {} [{}/{} ({:.0f}%)]\tLoss: {:.6f}'.format(
+ epoch, batch_idx * len(data), len(train_loader.dataset),
+ 100. * batch_idx / len(train_loader), loss.item()))
+
+
+def test(args, model, device, test_loader):
+ model.eval()
+ test_loss = 0
+ correct = 0
+ with torch.no_grad():
+ for data, target in test_loader:
+ data, target = data.to(device), target.to(device)
+ output = model(data)
+ test_loss += F.nll_loss(output, target, reduction='sum').item() # sum up batch loss
+ pred = output.argmax(dim=1, keepdim=True) # get the index of the max log-probability
+ correct += pred.eq(target.view_as(pred)).sum().item()
+
+ test_loss /= len(test_loader.dataset)
+
+ print('\nTest set: Average loss: {:.4f}, Accuracy: {}/{} ({:.0f}%)\n'.format(
+ test_loss, correct, len(test_loader.dataset),
+ 100. * correct / len(test_loader.dataset)))
+
+
+def main():
+ # Training settings
+ parser = argparse.ArgumentParser(description='PyTorch MNIST Example')
+ parser.add_argument('--batch-size', type=int, default=64, metavar='N',
+ help='input batch size for training (default: 64)')
+ parser.add_argument('--test-batch-size', type=int, default=1000, metavar='N',
+ help='input batch size for testing (default: 1000)')
+ parser.add_argument('--epochs', type=int, default=10, metavar='N',
+ help='number of epochs to train (default: 10)')
+ parser.add_argument('--lr', type=float, default=0.01, metavar='LR',
+ help='learning rate (default: 0.01)')
+ parser.add_argument('--momentum', type=float, default=0.5, metavar='M',
+ help='SGD momentum (default: 0.5)')
+ parser.add_argument('--no-cuda', action='store_true', default=False,
+ help='disables CUDA training')
+ parser.add_argument('--seed', type=int, default=1, metavar='S',
+ help='random seed (default: 1)')
+ parser.add_argument('--log-interval', type=int, default=10, metavar='N',
+ help='how many batches to wait before logging training status')
+
+ parser.add_argument('--save-model', action='store_true', default=True,
+ help='For Saving the current Model')
+ args = parser.parse_args()
+ use_cuda = not args.no_cuda and torch.cuda.is_available()
+
+ torch.manual_seed(args.seed)
+
+ device = torch.device("cuda" if use_cuda else "cpu")
+
+ kwargs = {'num_workers': 4, 'pin_memory': True} if use_cuda else {}
+ train_loader = torch.utils.data.DataLoader(
+ datasets.MNIST('../data', train=True, download=True,
+ transform=transforms.Compose([
+ transforms.ToTensor(),
+ transforms.Normalize((0.1307,), (0.3081,))
+ ])),
+ batch_size=args.batch_size, shuffle=True, **kwargs)
+ test_loader = torch.utils.data.DataLoader(
+ datasets.MNIST('../data', train=False, transform=transforms.Compose([
+ transforms.ToTensor(),
+ transforms.Normalize((0.1307,), (0.3081,))
+ ])),
+ batch_size=args.test_batch_size, shuffle=True, **kwargs)
+
+ model = Net().to(device)
+ optimizer = optim.SGD(model.parameters(), lr=args.lr, momentum=args.momentum)
+
+ for epoch in range(1, args.epochs + 1):
+ train(args, model, device, train_loader, optimizer, epoch)
+ test(args, model, device, test_loader)
+
+    if args.save_model:
+ torch.save(model.state_dict(), "/tmp/mnist_cnn.pt")
+
+
+if __name__ == '__main__':
+ main()
diff --git a/examples/pytorch_tensorboard.py b/examples/pytorch_tensorboard.py
new file mode 100644
index 00000000..eebf560b
--- /dev/null
+++ b/examples/pytorch_tensorboard.py
@@ -0,0 +1,126 @@
+# TRAINS - Example of pytorch with tensorboard>=v1.14
+#
+from __future__ import print_function
+
+import argparse
+import torch
+import torch.nn as nn
+import torch.nn.functional as F
+import torch.optim as optim
+from torchvision import datasets, transforms
+from torch.autograd import Variable
+from torch.utils.tensorboard import SummaryWriter
+
+from trains import Task
+task = Task.init(project_name='examples', task_name='pytorch with tensorboard')
+
+
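+# TRAINS hooks the SummaryWriter, so everything reported through it below is
+# also logged to the TRAINS server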
+writer = SummaryWriter('runs')
+writer.add_text('lstm', 'This is an lstm', 0)
+# Training settings
+parser = argparse.ArgumentParser(description='PyTorch MNIST Example')
+parser.add_argument('--batch-size', type=int, default=64, metavar='N',
+ help='input batch size for training (default: 64)')
+parser.add_argument('--test-batch-size', type=int, default=1000, metavar='N',
+ help='input batch size for testing (default: 1000)')
+parser.add_argument('--epochs', type=int, default=2, metavar='N',
+ help='number of epochs to train (default: 10)')
+parser.add_argument('--lr', type=float, default=0.01, metavar='LR',
+ help='learning rate (default: 0.01)')
+parser.add_argument('--momentum', type=float, default=0.5, metavar='M',
+ help='SGD momentum (default: 0.5)')
+parser.add_argument('--no-cuda', action='store_true', default=False,
+ help='disables CUDA training')
+parser.add_argument('--seed', type=int, default=1, metavar='S',
+ help='random seed (default: 1)')
+parser.add_argument('--log-interval', type=int, default=10, metavar='N',
+ help='how many batches to wait before logging training status')
+args = parser.parse_args()
+args.cuda = not args.no_cuda and torch.cuda.is_available()
+
+torch.manual_seed(args.seed)
+if args.cuda:
+ torch.cuda.manual_seed(args.seed)
+
+kwargs = {'num_workers': 4, 'pin_memory': True} if args.cuda else {}
+train_loader = torch.utils.data.DataLoader(datasets.MNIST('../data', train=True, download=True,
+ transform=transforms.Compose([
+ transforms.ToTensor(),
+ transforms.Normalize((0.1307,), (0.3081,))])),
+ batch_size=args.batch_size, shuffle=True, **kwargs)
+test_loader = torch.utils.data.DataLoader(datasets.MNIST('../data', train=False,
+ transform=transforms.Compose([
+ transforms.ToTensor(),
+ transforms.Normalize((0.1307,), (0.3081,))])),
+ batch_size=args.batch_size, shuffle=True, **kwargs)
+
+
+class Net(nn.Module):
+ def __init__(self):
+ super(Net, self).__init__()
+ self.conv1 = nn.Conv2d(1, 10, kernel_size=5)
+ self.conv2 = nn.Conv2d(10, 20, kernel_size=5)
+ self.conv2_drop = nn.Dropout2d()
+ self.fc1 = nn.Linear(320, 50)
+ self.fc2 = nn.Linear(50, 10)
+
+ def forward(self, x):
+ x = F.relu(F.max_pool2d(self.conv1(x), 2))
+ x = F.relu(F.max_pool2d(self.conv2_drop(self.conv2(x)), 2))
+ x = x.view(-1, 320)
+ x = F.relu(self.fc1(x))
+ x = F.dropout(x, training=self.training)
+ x = self.fc2(x)
+        return F.log_softmax(x, dim=1)
+
+
+model = Net()
+if args.cuda:
+ model.cuda()
+optimizer = optim.SGD(model.parameters(), lr=args.lr, momentum=args.momentum)
+
+
+def train(epoch):
+ model.train()
+ for batch_idx, (data, target) in enumerate(train_loader):
+ if args.cuda:
+ data, target = data.cuda(), target.cuda()
+ data, target = Variable(data), Variable(target)
+ optimizer.zero_grad()
+ output = model(data)
+ loss = F.nll_loss(output, target)
+ loss.backward()
+ optimizer.step()
+ if batch_idx % args.log_interval == 0:
+ print('Train Epoch: {} [{}/{} ({:.0f}%)]\tLoss: {:.6f}'.format(
+ epoch, batch_idx * len(data), len(train_loader.dataset),
+ 100. * batch_idx / len(train_loader), loss.data.item()))
+ niter = epoch*len(train_loader)+batch_idx
+ writer.add_scalar('Train/Loss', loss.data.item(), niter)
+
+
+def test():
+ model.eval()
+ test_loss = 0
+ correct = 0
+ for niter, (data, target) in enumerate(test_loader):
+ if args.cuda:
+ data, target = data.cuda(), target.cuda()
+ data, target = Variable(data, volatile=True), Variable(target)
+ output = model(data)
+ test_loss += F.nll_loss(output, target, size_average=False).data.item() # sum up batch loss
+        pred = output.data.max(1)[1]  # get the index of the max log-probability
+        batch_correct = pred.eq(target.data).cpu().sum()  # correct predictions in this batch
+        writer.add_scalar('Test/Loss', batch_correct, niter)
+        correct += batch_correct
+
+ test_loss /= len(test_loader.dataset)
+ print('\nTest set: Average loss: {:.4f}, Accuracy: {}/{} ({:.0f}%)\n'.format(
+ test_loss, correct, len(test_loader.dataset),
+ 100. * correct / len(test_loader.dataset)))
+
+
+for epoch in range(1, args.epochs + 1):
+ train(epoch)
+ torch.save(model, '/tmp/model{}'.format(epoch))
+test()
diff --git a/examples/pytorch_tensorboardX.py b/examples/pytorch_tensorboardX.py
new file mode 100644
index 00000000..859a8bd0
--- /dev/null
+++ b/examples/pytorch_tensorboardX.py
@@ -0,0 +1,126 @@
+# TRAINS - Example of pytorch with tensorboardX
+#
+from __future__ import print_function
+
+import argparse
+import torch
+import torch.nn as nn
+import torch.nn.functional as F
+import torch.optim as optim
+from torchvision import datasets, transforms
+from torch.autograd import Variable
+from tensorboardX import SummaryWriter
+
+from trains import Task
+task = Task.init(project_name='examples', task_name='pytorch with tensorboardX')
+
+
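+# TRAINS hooks tensorboardX's SummaryWriter, so everything reported through it
+# below is also logged to the TRAINS server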
+writer = SummaryWriter('runs')
+writer.add_text('lstm', 'This is an lstm', 0)
+# Training settings
+parser = argparse.ArgumentParser(description='PyTorch MNIST Example')
+parser.add_argument('--batch-size', type=int, default=64, metavar='N',
+ help='input batch size for training (default: 64)')
+parser.add_argument('--test-batch-size', type=int, default=1000, metavar='N',
+ help='input batch size for testing (default: 1000)')
+parser.add_argument('--epochs', type=int, default=2, metavar='N',
+ help='number of epochs to train (default: 10)')
+parser.add_argument('--lr', type=float, default=0.01, metavar='LR',
+ help='learning rate (default: 0.01)')
+parser.add_argument('--momentum', type=float, default=0.5, metavar='M',
+ help='SGD momentum (default: 0.5)')
+parser.add_argument('--no-cuda', action='store_true', default=False,
+ help='disables CUDA training')
+parser.add_argument('--seed', type=int, default=1, metavar='S',
+ help='random seed (default: 1)')
+parser.add_argument('--log-interval', type=int, default=10, metavar='N',
+ help='how many batches to wait before logging training status')
+args = parser.parse_args()
+args.cuda = not args.no_cuda and torch.cuda.is_available()
+
+torch.manual_seed(args.seed)
+if args.cuda:
+ torch.cuda.manual_seed(args.seed)
+
+kwargs = {'num_workers': 4, 'pin_memory': True} if args.cuda else {}
+train_loader = torch.utils.data.DataLoader(datasets.MNIST('../data', train=True, download=True,
+ transform=transforms.Compose([
+ transforms.ToTensor(),
+ transforms.Normalize((0.1307,), (0.3081,))])),
+ batch_size=args.batch_size, shuffle=True, **kwargs)
+test_loader = torch.utils.data.DataLoader(datasets.MNIST('../data', train=False,
+ transform=transforms.Compose([
+ transforms.ToTensor(),
+ transforms.Normalize((0.1307,), (0.3081,))])),
+ batch_size=args.batch_size, shuffle=True, **kwargs)
+
+
+class Net(nn.Module):
+ def __init__(self):
+ super(Net, self).__init__()
+ self.conv1 = nn.Conv2d(1, 10, kernel_size=5)
+ self.conv2 = nn.Conv2d(10, 20, kernel_size=5)
+ self.conv2_drop = nn.Dropout2d()
+ self.fc1 = nn.Linear(320, 50)
+ self.fc2 = nn.Linear(50, 10)
+
+ def forward(self, x):
+ x = F.relu(F.max_pool2d(self.conv1(x), 2))
+ x = F.relu(F.max_pool2d(self.conv2_drop(self.conv2(x)), 2))
+ x = x.view(-1, 320)
+ x = F.relu(self.fc1(x))
+ x = F.dropout(x, training=self.training)
+ x = self.fc2(x)
+        return F.log_softmax(x, dim=1)
+
+
+model = Net()
+if args.cuda:
+ model.cuda()
+optimizer = optim.SGD(model.parameters(), lr=args.lr, momentum=args.momentum)
+
+
+def train(epoch):
+ model.train()
+ for batch_idx, (data, target) in enumerate(train_loader):
+ if args.cuda:
+ data, target = data.cuda(), target.cuda()
+ data, target = Variable(data), Variable(target)
+ optimizer.zero_grad()
+ output = model(data)
+ loss = F.nll_loss(output, target)
+ loss.backward()
+ optimizer.step()
+ if batch_idx % args.log_interval == 0:
+ print('Train Epoch: {} [{}/{} ({:.0f}%)]\tLoss: {:.6f}'.format(
+ epoch, batch_idx * len(data), len(train_loader.dataset),
+ 100. * batch_idx / len(train_loader), loss.data.item()))
+ niter = epoch*len(train_loader)+batch_idx
+ writer.add_scalar('Train/Loss', loss.data.item(), niter)
+
+
+def test():
+ model.eval()
+ test_loss = 0
+ correct = 0
+ for niter, (data, target) in enumerate(test_loader):
+ if args.cuda:
+ data, target = data.cuda(), target.cuda()
+ data, target = Variable(data, volatile=True), Variable(target)
+ output = model(data)
+ test_loss += F.nll_loss(output, target, size_average=False).data.item() # sum up batch loss
+        pred = output.data.max(1)[1]  # get the index of the max log-probability
+        batch_correct = pred.eq(target.data).cpu().sum()  # correct predictions in this batch
+        writer.add_scalar('Test/Loss', batch_correct, niter)
+        correct += batch_correct
+
+ test_loss /= len(test_loader.dataset)
+ print('\nTest set: Average loss: {:.4f}, Accuracy: {}/{} ({:.0f}%)\n'.format(
+ test_loss, correct, len(test_loader.dataset),
+ 100. * correct / len(test_loader.dataset)))
+
+
+for epoch in range(1, args.epochs + 1):
+ train(epoch)
+ torch.save(model, '/tmp/model{}'.format(epoch))
+test()
diff --git a/examples/samples/dancing.jpg b/examples/samples/dancing.jpg
new file mode 100644
index 0000000000000000000000000000000000000000..4bb9da7e26a3702c7da783540c84c813ea743a10
Binary files /dev/null and b/examples/samples/dancing.jpg differ
zymjb7uAJCCLb&ZV@i$Fzn6+q~WYp0tVk>R)UFM5c&9T!O$#j&U>S#UFHMdwju9C=D
z&!XGIyj5{c@-~rh5TwSB-U=ztC%!?Pu=@R+>SH~M!j6x5z}#tk0|b|oR&N_I^jNO(
z#T&!}6oCdh_P5yzw~)=^j0|a&52<)3>7S6&_4%%T(EbtqtA{0fkis9u7pa`IH9i%;
z9?3?Z7P-k@Se|dMbnW{vbr?+KVNB{u0@%uk%pF3qaOsTH4R}zatt?llf&1+3J7_c?
za7kU`HGc>>$0abO~hBm20=MP9G{(D7E^MW
zvTUkQC_k5^5R}V>u>93FCy`MDlMs9hf{@fFt>*4Xai$#TBy;;3*Mzx!D^v}(8CVGj
zAh`E_h(Y$ZQwxI`ZqBas>nAAt?2$6=_#=&Y2_^d@D--LJqznF7Q^vOn?%b4?aL^3E
zr_We?63<39Mq-g=5Dw*=T9eo3QKjQSX2yy-KfpkUAp5}fp~^PlDN!{4y{yrkAB$xU
z`p9q5q6Ur^x(gq-q@Lo9>|DxbIu`{q+jRrNeJMo#V+|7JB0}60&h~Y8GJ?lByxtUA
zAF{9p*g_5-Igftpzs&k?ZeKw6<9v9t)7@V2nu1W8`o)kOKu1xHgWBg44#OJLjp$Ft
z@|k-LOvDj^xoquUtY#z^J^X|7H4RLW5mwrQ!+^KD$1lm|H|IDgqg!)ozA!m~zbQ7s
z8Eo{Ny@UX&Qc?DtKg4>cwVMU2VyKrdF}MAx8cF`)x`N4G>5n0aj2~G;VW+U(>l*&AA4EYyEq)3MN~^i
zwUdCXX_?#|N{R*vTOg_}eHLbLHV@JIpQ5y=AYarWQPo#~w&a2t*Jhjspz4cdO0s~P
z=AEXB2-P`jcF~W00s^eT5Pl-_Z&M2;0RqoyB2-V3Gw{1P$&OAmL<9n9e`Gh>JWL`s
zU6z>oUf>oA2zb3&f_H6J2aaPOE`&=RyB`R8$Z?g{$pzj)ZEIH)K7A6(zbK{uBe?&&
z!o3SKK(b`r!b!l-GQ)$$o#Aa2@b3%86Xo0!Mm~w!*_v_t-x=k$Ae>GN|C0tb*p*@Q
z=XUR5g3Mfr-KdkA+phI-P=%u|_;PJBmRm5JS8kU3uqcc!)sSnbD_v#M>L&pF+s?(TcjLJhHYP@^xETo
z6^%E!IX`*P%zJ-*Wb;)%`O(}Kv-ej%n)QcTOi-w6FsEFRbJXCv$lW*FjkceSGejpr
z56pTte5Z%wA9RnF&RzjqyaO%u#~*a1&rYMvgTXc}h=JW#D1OxI9j${Ks=coOR2900
zGiYp+oFwyfWcvPd6?@ZxNz=w}oW=Y>7as`4BG2IN*(YHD2zV?J(=k>#j>Of$`?FCY
z&53#cF1?Bw@&YZM_|4x)&zXg@TzU6Jqv2)DrH%t{z+HAcbKCPbfx#kz;TO1sZMp`7
zsWP&!_EknJ(&-z3V83|#{Q;jDS-K$sPrSX;b5j*rRJ=al6DAV%yZr+D#+lK5s#HS0
z3Q%H;s{YKL;jVmT!Sl}N4hxYr2hRTKz^|&G7v&E?!2I@?jqDw7+oPvst+vVeiFozrlCm2X4*a6azW*JqvSh>RA_AnPSQ6
z=~Bl&c4(~Ife~~STcNVrf=77L;P6S)TKJ^MuTkXABJw;n`enW_NrRS=7Sr(DJOD+3bIyZb!qHrM81<
zb263Dw{!;@#s91nMIvVikh(Lrcd)z(CAu8i7KZoLzR9s{1_#cNTbe?AizEq(&0WX*
z4~jx}rshd30&F=#fU9D1rr?7QKZ|9wulgHZXPR5bDfBkVee_MQle*D)t^()rT-B(GT$KAsEw9BB^{jqT)oz3GDEhpu0Fvj
zc)X-HCOBiwrjL>K`9z>9+i1JdM5ZTm^^LaL?yhYfX}RzgWuXZjGt0SFa)x&BadHZ-
z;)u8jR1H5P_$_Cr;A-_E;*@;%_oyEZ%6KgiiOtu%x%5g6thtyjoMj2O|VRFSYPbZ
zkN`FU7ug5xo=EmT_~W;ZKnbi4GdFl^ISj26*A_1dpcPbS?=
z1%LnsH2QSik6P;VyJnr&~efoTY{IKWf8Ph>%}3}gW%1!&XtQO>OXojY@otZ1baSt
zQU10gg9dt@(EPo^h|}0aWpI~hStPw*yf_>S35TL;^wbzK;LTznQy}BZtcy2&Fn>&Q
zha4g%wid+)@)Fu`J2cu$a=s;?rzTp9*a@;}Gi1Sk1!&FJk21m3L0AN&U9cO83TPZb
zSAw6&!K4rx10Es!vz0y*po8Z*LW-N2GnNT8#Ym!zA5)f_(duP})+r;N%386Xxv90U
z$AEi)<=oZpNK;I%maTI*&BEA)UqAkgl!7hB7mqbfthYs%1l6Wls)}az5N7;hR^`W%
z(bvRD`K~LS^H=Y08039=s9{6q#OtU4V}QZYQ^U;0&mOtPfoTu;ky3o4K5tL0`7fM9
zNrO1;@nZ2@v0q%iUyDXMSi#mWk&FQ&_h1wGh99mfKAQbPbKSm^8$2|rI4ingzbiZE
z@y(p+O+I+=b8MI>2uMXeY^eTS7&v*R`+_{Y`3+$dzhYZh@dX=o=drQ+8{@iYrJu{7
zlXhF={!decSAb0d>68Cl|Bvv(d;D-1Cz-Q)*S^kn2IgRtXZ$uz9#V9keS
z*r1Z+EzvM(eE!ecEp@MeRcMjkPMkuyYb@qcHwt1$z#2h+8+}7fsi|`7GS8xIHcIL~
zh}Pe;3Uux>@cY;>iWlc(-514uX+lPncszXb>0#A%^)GQj+Rz#Itd33>kE#~7b2kxH
zjTbkU)=_1k`CE?gvw-%z6>+QzC_9V?jSzMg?S$xUUm42$x2>QIrZVp3YD%N*HSuE;
z(v9GyAxPRXL-Q;fi}H@dZUzlnsoZKQIM$}Vh4vq^B#JA`N@UpkCoCH4;Up+=C#=bmM*UV&%HX?ewtT6+WY7$%C4WC
z2vY%m+YX~dg^$Lcm4SlCLv|zEzd`%UtqdY;hYw2LW$!z_wwEP;-whoCdf$7HF08vPL`7Z9rEcFNq*-rfuWjV36mXwP%<NyhkHH>
zOa=!@*CX9!(G%=l2HtwzR)&(x&f=h(U&|{Xw7_HLH{J&MyC5dx0O9(D3MVte)LC;d
zzGbYt>LqQ<8eRW#d;&r~Y3xtVVi|l3{*iC50ApcU%cHJOtyiLFAMezaNo3>dEpkXZ;&eop*STRF}+FX-SxW69VtIeyXw
z^*8M`jZ+OQ)w2^gO*GFQlenT7AzQld%BD?0CHs{RIa@T67H(Nu-RoHB(LKLQ{spAm
zq3R3{-(1ftHZn$XmP(XqiJKvOxZXdj3V8nZ^xmV;plUffsnuDSKC0i6pQw}NOsEl+
z>yfkB?bwtnKzs(T(n^L9Q3%V_-?w&35pqDaE#x^N%6PB3^_G^4eK*rru1mCIc|SJ+
zB%%$l-4f7}QZw&uQF>(afj$7xJ(&@aI>Zd3X1!%