Fix Typos in the Getting Started section. (#189)
Co-authored-by: Janos Gabler <[email protected]>
roecla and janosg authored Feb 6, 2021
1 parent d7d32e7 commit c21bcbe
Showing 20 changed files with 100 additions and 67 deletions.
2 changes: 1 addition & 1 deletion .conda/meta.yaml
@@ -28,7 +28,7 @@ requirements:
- cloudpickle
- joblib
- numpy >=1.16
- pandas >=0.24,<=1.0.5
- pandas >=0.24
- bokeh >=1.3
- scipy
- fuzzywuzzy
1 change: 0 additions & 1 deletion docs/environment.yml
@@ -5,7 +5,6 @@ dependencies:
- ipython
- pip:
- sphinx
- parso==0.8.0
- pydata-sphinx-theme>=0.3.0
- nbsphinx
- sphinxcontrib-bibtex
2 changes: 1 addition & 1 deletion docs/source/conf.py
@@ -103,7 +103,7 @@
# built documents.
#
# The full version, including alpha/beta/rc tags.
release = "0.1.1"
release = "0.1.2"
version = ".".join(release.split(".")[:2])

# The language for content autogenerated by Sphinx. Refer to documentation
22 changes: 11 additions & 11 deletions docs/source/getting_started/first_optimization_with_estimagic.ipynb
@@ -16,9 +16,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# First optimization with estimagic\n",
"# First Optimization with estimagic\n",
"\n",
"This tutorial shows how to do an optimization with estimagic. It uses a very simple criterion function in order to focus on the mechanics of doing an optimization. A more interesting example can be found in the [ordered logit example](ordered_logit_example.ipynb). More details on the topics covered here can be found in the [how to guides](../how_to_guides/index.html)."
"This tutorial shows how to do an optimization with estimagic. It uses a very simple criterion function in order to focus on the mechanics of doing an optimization. A more interesting example can be found in the [ordered logit example](ordered_logit_example.ipynb). More details on the topics covered here can be found in the [how to guides](../how_to_guides/index.rst)."
]
},
{
@@ -29,7 +29,7 @@
"\n",
"Criterion functions in estimagic take a DataFrame with parameters as first argument and return a dictionary that contains the output of the criterion function. \n",
"\n",
"The output dictionary must contain the entry \"value\", which is a scalar but can also contain an arbitrary number of additional entries. Entries with special meaning are \"contributions\" and \"root_contributions\", which are used by specialized optimizers (e.g. nonlinear least squares optimizers use the \"root_contributions\"). All other entries are simply stored in a log file. If none of the optional entries is required, the criterion function can also simply return a scalar. "
"The output dictionary must contain the entry \"value\", which is a scalar but can also contain an arbitrary number of additional entries. Entries with special meaning are \"contributions\" and \"root_contributions\", which are used by specialized optimizers (e.g. nonlinear least squares optimizers use the \"root_contributions\"). All other entries are simply stored in a log file. If none of the optional entries are required, the criterion function can also simply return a scalar. "
]
},
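The cell above describes the expected output format of a criterion function. A minimal sketch, close to the sphere function this tutorial uses, assuming estimagic's convention of a params DataFrame with a "value" column:

```python
import pandas as pd


def sphere(params):
    # params is a DataFrame whose "value" column holds the parameter vector
    return {
        "value": (params["value"] ** 2).sum(),  # scalar criterion value
        "contributions": params["value"] ** 2,  # one summand per parameter
        "root_contributions": params["value"],  # residuals, used by least squares optimizers
    }


start_params = pd.DataFrame({"value": [1.0, 2.0, 3.0]})
```

Returning only the scalar `(params["value"] ** 2).sum()` would also work, but then the least squares structure is not available to specialized optimizers.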
{
@@ -159,7 +159,7 @@
"source": [
"## Running a simple optimization\n",
"\n",
"Estimagic's `minimize` function is works similar to scipy's `minimize` function. A big difference is however, that estimagic does not have a default optimization algorithm. This is on purpose, because the algorithm choice should always be dependent on the problem one wants to solve. \n",
"Estimagic's `minimize` function works similarly to scipy's `minimize` function. A big difference is however, that estimagic does not have a default optimization algorithm. This is on purpose, because the algorithm choice should always be dependent on the problem one wants to solve. \n",
"\n",
"Another difference is that estimagic also has a `maximize` function that works exactly as `minimize`, but does a maximization. \n",
"\n",
@@ -299,7 +299,7 @@
"source": [
"## Running an optimization with a least squares optimizer\n",
"\n",
"Using a least squares optimizer in estimagic is exactly the same as using another optimizer. That was the whole point of allowing for a dictionary as output of the criterion function. "
"Using a least squares optimizer in estimagic is exactly the same as using another optimizer. That is the goal and result of allowing the output of the criterion function to be a dictionary. "
]
},
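The two cells above introduce `minimize` and the least squares case. A minimal sketch, continuing the `sphere` example; the algorithm names follow the naming scheme of recent estimagic versions, "nag_dfols" assumes the optional DFO-LS package is installed, and the exact return format depends on the installed version:

```python
from estimagic import minimize

# standard optimization; estimagic has no default algorithm, so one must be passed explicitly
res = minimize(
    criterion=sphere,
    params=start_params,
    algorithm="scipy_lbfgsb",
)

# least squares optimization: only the algorithm name changes; the optimizer
# relies on the "root_contributions" entry of the criterion output
res_ls = minimize(
    criterion=sphere,
    params=start_params,
    algorithm="nag_dfols",
)
```

`maximize` is called in the same way and performs a maximization instead.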
{
@@ -398,7 +398,7 @@
"source": [
"## Adding bounds\n",
"\n",
"Bounds are simply added as additional columns in the start parameters. If a parameter has no bound, use np.inf for upper bounds and -np.inf for lower bounds. "
"Bounds are simply added as additional columns in the start parameters. If a parameter has no bound, use `np.inf` for upper bounds and `-np.inf` for lower bounds. "
]
},
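A sketch of start parameters with bounds. The column names "lower_bound" and "upper_bound" are an assumption; depending on the estimagic version the columns may be called "lower" and "upper", so check the documentation of your installed version:

```python
import numpy as np
import pandas as pd

params_with_bounds = pd.DataFrame(
    {
        "value": [1.0, 2.0, 3.0],
        "lower_bound": [-np.inf, 0.5, -np.inf],  # -np.inf means "no lower bound"
        "upper_bound": [np.inf, np.inf, 2.5],  # np.inf means "no upper bound"
    }
)
```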
{
@@ -600,7 +600,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"As you probably suspect, the estimagic constraint syntax is way more general than what we just did. For details see [how to specify constraints](../how_to_guides/optimization/how_to_specify_constraints.html)"
"As you probably suspect, the estimagic constraint syntax is much more general than what we just did. For details see [how to specify constraints](../how_to_guides/optimization/how_to_specify_constraints.rst)"
]
},
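As an illustration of the mechanism only (the selectors and constraint types below are assumptions; the linked guide is the authoritative reference), constraints are passed to `minimize` as a list of dictionaries:

```python
# selectors and constraint types chosen purely for illustration
constraints = [
    {"loc": 0, "type": "fixed"},  # keep the first parameter at its start value
    {"loc": [1, 2], "type": "increasing"},  # parameters 1 and 2 must be weakly increasing
]

res_constrained = minimize(
    criterion=sphere,
    params=start_params,
    algorithm="scipy_lbfgsb",
    constraints=constraints,
)
```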
{
@@ -609,7 +609,7 @@
"source": [
"## Using and reading persistent logging\n",
"\n",
"In fact, you have already used a persistent log the whole time. It is stored under \"logging.db\" in your working directory. If you want to store it in a different place, you can do that:"
"In fact, we have already been using a persistent log the whole time. It is stored under \"logging.db\" in our working directory. If you want to store it in a different place, you can do that:"
]
},
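A minimal sketch of redirecting the log, assuming the `logging` argument accepts a file path as in the estimagic versions around this release:

```python
res = minimize(
    criterion=sphere,
    params=start_params,
    algorithm="scipy_lbfgsb",
    logging="my_optimization.db",  # write the persistent log here instead of ./logging.db
)
```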
{
@@ -670,7 +670,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"The persistent log file is always instantly synchronized when the optimizer tries a new parameter vector. This is very handy if an optimization has to be aborted and you want to extract the current status. It is also used by the [estimagic dashboard](../how_to_guides/optimization/how_to_use_the_dashboard.html). "
"The persistent log file is always instantly synchronized when the optimizer tries a new parameter vector. This is very handy if an optimization has to be aborted and you want to extract the current status. It is also used by the [estimagic dashboard](../how_to_guides/optimization/how_to_use_the_dashboard.rst). "
]
},
{
@@ -679,7 +679,7 @@
"source": [
"## Passing algorithm specific options to minimize\n",
"\n",
"Most algorithms have a few optional arguments. Examples are convergence criteria or tuning parameters of the algorithm. We standardize the names of these options as much as possible, but not all algorithms support all options. You can find an overview of supported arguments [here](../how_to_guides/optimization/how_to_specify_algorithm_and_algo_options.html)."
"Most algorithms have a few optional arguments. Examples are convergence criteria or tuning parameters. We standardize the names of these options as much as possible, but not all algorithms support all options. You can find an overview of supported arguments [here](../how_to_guides/optimization/how_to_specify_algorithm_and_algo_options.rst)."
]
},
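A sketch of passing such options via `algo_options`. The option names below are hypothetical placeholders used only to show the mechanism; the linked overview lists the names that are actually supported by each algorithm:

```python
res = minimize(
    criterion=sphere,
    params=start_params,
    algorithm="scipy_lbfgsb",
    algo_options={
        # hypothetical option names, for illustration only
        "convergence_relative_criterion_tolerance": 1e-10,
        "stopping_max_iterations": 1_000,
    },
)
```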
{
@@ -795,7 +795,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.8"
"version": "3.8.6"
}
},
"nbformat": 4,
2 changes: 1 addition & 1 deletion docs/source/getting_started/index.rst
@@ -4,7 +4,7 @@ Getting Started


This section is a quick introduction to the basic functionality of estimagic.
Contains not only introductions to estimagic, but also very brief introductions to
It contains not only introductions to estimagic, but also very brief introductions to
numerical optimization and differentiation.

.. toctree::
15 changes: 8 additions & 7 deletions docs/source/getting_started/installation.rst
@@ -10,6 +10,7 @@ The package can be installed via conda. To do so, type the following commands in
a terminal or shell:

``$ conda config --add channels conda-forge``

``$ conda install -c opensourceeconomics estimagic``

The first line adds conda-forge to your conda channels. This is necessary for
@@ -20,9 +21,9 @@
Installing optional dependencies
================================

Only the ``scipy`` optimizers are a mandatory dependency of estimagic. Other algorithms
Only ``scipy`` is a mandatory dependency of estimagic. Other algorithms
become available if you install more packages. We make this optional because most of the
time you will use at least one additional package, but only very rarely you will need all
time you will use at least one additional package, but only very rarely will you need all
of them.


Expand All @@ -32,14 +33,14 @@ see :ref:`list_of_algorithms`.

To enable all algorithms at once, do the following:

``conda install nlopt``
.. ``conda install nlopt``
``pip install Py-BOBYQA``

``pip install DFOLS``
``pip install DFO-LS``

``conda install petsc4py`` (Not available on windows)
``conda install petsc4py`` (Not available on Windows)

``conda install cyipopt`` (Not available on windows)
.. ``conda install cyipopt`` (Not available on Windows)
``conda install pygmo``
.. ``conda install pygmo``
2 changes: 1 addition & 1 deletion docs/source/getting_started/ordered_logit_example.ipynb
@@ -545,7 +545,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"This looks pretty good! The parameter estimates line up perfectly. I actually had to try three optimizers to get at least one differenet digit which makes the result more credible. Other optimizers like `nlopt_bobyqa` and `nlopt_neledermead` hit it on all digits!\n",
"This looks pretty good! The parameter estimates line up perfectly. I actually had to try three optimizers to get at least one differenet digit which makes the result more credible. Other optimizers hit it on all digits.\n",
"\n",
"<div class=\"alert alert-danger\">\n",
"Note that standard error calculation, especially in combination with constraints is still considered experimental in estimagic.\n",
@@ -17,7 +17,7 @@
"source": [
"# Visualizing an Optimization Problem\n",
"\n",
"In order to choose the right optimization algorithm, it is important to know as much as possible about an optimization problems. If the criterion function has only one or two parameters, plotting it over the sample space can be helpful. However, such low dimensional problems are rare. \n",
"In order to choose the right optimization algorithm, it is important to know as much as possible about the problem one wants to solve. If the criterion function has only one or two parameters, plotting it over the sample space can be helpful. However, such low dimensional problems are rare. \n",
"\n",
"In this notebook we show how higher dimensional functions can be visualized using estimagic and which properties of the criterion function can be learned from them. \n",
"\n",
@@ -87,10 +87,10 @@
"source": [
"The plot gives us the following insights:\n",
"\n",
"- No matter at which value the other parameters are, there seems to be a minimum at 0 for each parameter. Thus coordinate descent is probably a good strategy. \n",
"- There is no sign of local optima \n",
"- There is no sign of noise or non-differentiablities (careful, grid might not be fine enough)\n",
"- The problem seems to be convex\n",
"- No matter at which value the other parameters are, there seems to be a minimum at 0 for each parameter. Thus, coordinate descent is probably a good strategy. \n",
"- There is no sign of local optima. \n",
"- There is no sign of noise or non-differentiablities (careful, grid might not be fine enough).\n",
"- The problem seems to be convex.\n",
"\n",
"-> We would expect almost any derivative based optimizer to work well here (which we know to be correct in that case)"
]
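The notebook's plots show univariate slices of the criterion: each parameter is varied over a grid while the others stay at fixed values. The idea can be sketched without estimagic's own plotting code (this is not the notebook's implementation, just a hand-rolled illustration with matplotlib):

```python
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd


def sphere(params):
    return {"value": (params["value"] ** 2).sum()}


start = pd.DataFrame({"value": [1.0, 2.0, 3.0]})
grid = np.linspace(-5, 5, 101)

fig, axes = plt.subplots(1, len(start), figsize=(12, 3))
for i, ax in enumerate(axes):
    # vary one parameter over the grid, holding the others at their start values
    slice_values = []
    for x in grid:
        p = start.copy()
        p.loc[i, "value"] = x
        slice_values.append(sphere(p)["value"])
    ax.plot(grid, slice_values)
    ax.set_title(f"parameter {i}")
plt.show()
```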
@@ -158,13 +158,13 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Here the picture looks very differently\n",
"Here the picture looks very differently:\n",
"\n",
"- The minimum along each coordinate seems to strongly depend on the value of all other parameters. This means that fixing some parameters initially and only optimizing the rest would not work well. \n",
"- There is no sign of noise or non-differentiablities\n",
"- There are several local optima\n",
"- There is no sign of noise or non-differentiablities.\n",
"- There are several local optima.\n",
"\n",
"-> We would expect that a gradient based optimizer is efficient at finding local optima. Thus a good strategy would be to run gradient based optimizers from many starting values."
"-> We would expect that a gradient based optimizer is efficient at finding local optima. Thus, a good strategy would be to run gradient based optimizers from many starting values."
]
}
],
12 changes: 7 additions & 5 deletions docs/source/getting_started/which_optimizer_to_use.ipynb
@@ -17,13 +17,15 @@
"source": [
"# Which optimizer to use\n",
"\n",
"This is the very very very short guide on selecting a suitable optimization algorithm based on a minimum of information. We are working on a longer version that contains more background information and can be found [here](../how_to_guides/optimization/how_to_choose_optimizer.html). \n",
"This is the very very very short guide on selecting a suitable optimization algorithm based on a minimum of information. We are working on a longer version that contains more background information and can be found [here](../how_to_guides/optimization/how_to_choose_optimizer.rst). \n",
"\n",
"However, we will also keep this short guide for very impatient people who feel lucky enough. \n",
"\n",
"To select an optimizer, you need to answer three questions:\n",
"1. Is your criterion function differentiable\n",
"2. Do you have a nonlinear least squares structure (i.e. do you sum some kind of squared residuals at the end of your criterion function). "
"To select an optimizer, you need to answer two questions:\n",
"\n",
"1. Is your criterion function differentiable?\n",
"\n",
"2. Do you have a nonlinear least squares structure (i.e. do you sum some kind of squared residuals at the end of your criterion function)?"
]
},
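The decision the cell above describes can be summarized as a small lookup. The returned labels are broad algorithm families, not the guide's concrete algorithm recommendations:

```python
def optimizer_family(differentiable, least_squares):
    """Map the two questions to a broad family of algorithms (illustrative only)."""
    if differentiable and least_squares:
        return "derivative-based least squares optimizer"
    if differentiable:
        return "derivative-based optimizer"
    if least_squares:
        return "derivative-free least squares optimizer"
    return "derivative-free optimizer"
```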
{
@@ -386,7 +388,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.8"
"version": "3.8.6"
}
},
"nbformat": 4,