Commit 6f8551a

Browse files
authored
Fix issues related to providing a Jacobian function (#973)
* Updated the Minimizer.least_squares method to wrap the Jacobian function
  - The Jacobian function provided by the user is now handled similarly to how it is handled in, e.g., Minimizer.leastsq.
  - If the Jacobian function is called 'Dfun' in the keyword arguments, then it is also accepted (Minimizer.scalar_minimize already does this).
* Updated the Minimizer.__jacobian method signature to fix an exception
  - The signature now matches the signature of Minimizer.__residual. This was changed because the following exception was raised whenever the user provided a function for calculating the Jacobian: "TypeError: Minimizer.__jacobian() got an unexpected keyword argument 'apply_bounds_transformation'".
* Updated the Minimizer.__jacobian method to use the new parameter
  - The parameter values are now updated similarly to how it is done in Minimizer.__residual.
* Updated test_least_squares_jacobian_types to handle the wrapped Jacobian function
  - The params argument of the jac_array function is now an actual Parameters object, as it is in the f function, which means that the parameters can be accessed via string keys.
* Updated the coerce_float64 function
  - Added explicit support for sparse matrices (e.g., Block Sparse Row) and LinearOperator objects. This change is needed because the updated test_least_squares_jacobian_types would otherwise fail. This mirrors how these types are handled when calculating the Hessian matrix in Minimizer.least_squares.
* Renamed the Minimizer.__jacobian method to avoid a bug related to pickling
  - Attempting to use multiprocessing.Pool while providing, e.g., Minimizer.least_squares with a function for calculating the Jacobian did not work prior to renaming the method, due to an issue with pickling and unpickling private methods. See the (currently) open issue (python/cpython#77188) and pull request (python/cpython#21480) for more information; a minimal sketch of the underlying name-mangling behavior follows this list.
* Added a test for when the "least_squares" method is provided a function for the Jacobian
  - Verifies that a) the fitted parameters have the expected values, b) the jac function is actually called, and c) there are indeed fewer function evaluations as a result.
* Updated the docstring of Minimizer._jacobian
  - The docstring now includes the parameters and return values.
* Added tests for (un)pickling Minimizer when a Jacobian function is provided
* Updated the timeout value in test_jacobian_with_forkingpickler
  - Increased the timeout value so that slow runners can succeed instead of timing out. Without a timeout, a failing test could end up hanging instead.
* Added an example that demonstrates the benefits of providing a Jacobian function
* Updated the list of contributors
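To illustrate the pickling pitfall behind the rename, here is a minimal, standalone sketch (not part of the commit; the class and method names are stand-ins): a double-underscore method is stored in the class dictionary under its name-mangled form, and on Python versions affected by python/cpython#77188 (still open at the time of this commit) a bound reference to such a method does not survive a pickle round-trip, whereas a single-underscore method does:

import pickle


class Owner:
    def __mangled_jac(self):       # stored in the class dict as '_Owner__mangled_jac'
        return 'jacobian'

    def _plain_jac(self):          # single leading underscore: no name mangling
        return 'jacobian'


obj = Owner()
print(sorted(name for name in vars(Owner) if name.endswith('jac')))
# Only the mangled spelling '_Owner__mangled_jac' is present; there is no '__mangled_jac' key.

for bound in (obj._plain_jac, obj._Owner__mangled_jac):
    try:
        print(pickle.loads(pickle.dumps(bound))())
    except Exception as exc:
        # Expected for the mangled method on affected Python versions.
        print(f'{type(exc).__name__}: {exc}')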
1 parent 3935485, commit 6f8551a

File tree

5 files changed (+388, -8 lines changed)


AUTHORS.txt

Lines changed: 1 addition & 1 deletion
@@ -55,7 +55,7 @@ Roam, Alexander Stark, Alexandre Beelen, Andrey Aristov, Nicholas Zobrist,
 Ethan Welty, Julius Zimmermann, Mark Dean, Arun Persaud, Ray Osborn, @lneuhaus,
 Marcel Stimberg, Yoshiera Huang, Leon Foks, Sebastian Weigand, Florian LB,
 Michael Hudson-Doyle, Ruben Verweij, @jedzill4, @spalato, Jens Hedegaard Nielsen,
-Martin Majli, Kristian Meyer, @azelcer, Ivan Usov, and many others.
+Martin Majli, Kristian Meyer, @azelcer, Ivan Usov, Ville Yrjänä, and many others.
 
 The lmfit code obviously depends on, and owes a very large debt to the code
 in scipy.optimize. Several discussions on the SciPy-user and lmfit mailing
Lines changed: 178 additions & 0 deletions
@@ -0,0 +1,178 @@
"""
Benchmarks of methods with and without computing the Jacobian analytically
==========================================================================

Providing a function that calculates the Jacobian matrix analytically can
reduce the time spent finding a solution. The results from benchmarks comparing
two methods (``leastsq`` and ``least_squares``) with and without a function to
calculate the Jacobian matrix analytically are presented below.

First we define the model function, the residual function, and the appropriate
Jacobian functions:
"""
from timeit import timeit
from types import SimpleNamespace

import matplotlib.pyplot as plt
import numpy as np

from lmfit import Parameters, minimize

NUM_JACOBIAN_CALLS = 0


def func(var, x):
    return var[0] * np.exp(-var[1]*x) + var[2]


def residual(pars, x, data):
    a, b, c = pars['a'], pars['b'], pars['c']
    model = func((a, b, c), x)
    return model - data


def dfunc(pars, x, data):
    global NUM_JACOBIAN_CALLS
    NUM_JACOBIAN_CALLS += 1

    a, b = pars['a'], pars['b']
    v = np.exp(-b*x)
    return np.array([v, -a*x*v, np.ones(len(x))])


def jacfunc(pars, x, data):
    global NUM_JACOBIAN_CALLS
    NUM_JACOBIAN_CALLS += 1

    a, b = pars['a'], pars['b']
    v = np.exp(-b*x)
    jac = np.ones((len(x), 3), dtype=np.float64)
    jac[:, 0] = v
    jac[:, 1] = -a*x*v
    return jac


a, b, c = 2.5, 1.3, 0.8

x = np.linspace(0, 4, 50)
y = func([a, b, c], x)

data = y + 0.15*np.random.RandomState(seed=2021).normal(size=x.size)


###############################################################################
# Then we define the different cases to benchmark (i.e., different methods with
# and without a function to calculate the Jacobian analytically) and the number
# of repetitions per case:
cases = (
    dict(
        method='leastsq',
    ),
    dict(
        method='leastsq',
        Dfun=dfunc,
        col_deriv=1,
    ),
    dict(
        method='least_squares',
    ),
    dict(
        method='least_squares',
        jac=jacfunc,
    ),
)

num_repeats = 100
results = []

for kwargs in cases:
    params = Parameters()
    params.add('a', value=10)
    params.add('b', value=10)
    params.add('c', value=10)

    wrapper = lambda: minimize(
        residual,
        params,
        args=(x,),
        kws={'data': data},
        **kwargs,
    )
    time = timeit(wrapper, number=num_repeats) / num_repeats

    NUM_JACOBIAN_CALLS = 0
    fit = wrapper()

    results.append(SimpleNamespace(
        time=time,
        num_jacobian_calls=NUM_JACOBIAN_CALLS,
        fit=fit,
        kwargs=kwargs,
    ))


###############################################################################
# Finally, we present the results:
labels = []

for result in results:
    label = result.kwargs['method']
    if result.num_jacobian_calls > 0:
        label += ' with Jac.'

    labels.append(label)

label_width = max(map(len, labels))
lines = [
    '| '
    + ' | '.join([
        'Method'.ljust(label_width),
        'Avg. time (ms)',
        '# func. (+ Jac.) calls',
        'Chi-squared',
        'a'.ljust(5),
        'b'.ljust(5),
        'c'.ljust(6),
    ])
    + '|'
]

print(f'The "true" parameters are: a = {a:.3f}, b = {b:.3f}, c = {c:.3f}\n')
fig, ax = plt.subplots()
ax.plot(x, data, marker='.', linestyle='none', label='data')

for (result, label) in zip(results, labels):
    linestyle = '-'
    if result.num_jacobian_calls > 0:
        linestyle = '--'

    a = result.fit.params['a'].value
    b = result.fit.params['b'].value
    c = result.fit.params['c'].value
    y = func([a, b, c], x)
    ax.plot(x, y, label=label, alpha=0.5, linestyle=linestyle)

    columns = [
        label.ljust(label_width),
        f'{result.time*1000:.2f}'.ljust(14),
        (
            f'{result.fit.nfev}'
            + (
                f' (+{result.num_jacobian_calls})'
                if result.num_jacobian_calls > 0 else
                ''
            )
        ).ljust(22),
        f'{result.fit.chisqr:.3f}'.ljust(11),
        f'{a:.3f}'.ljust(5, '0'),
        f'{b:.3f}'.ljust(5, '0'),
        f'{c:.3f}'.ljust(5, '0'),
    ]
    lines.append('| ' + ' | '.join(columns) + ' |')

lines.insert(1, '|-' + '-|-'.join('-'*len(col) for col in columns) + '-|')
print('\n'.join(lines))

ax.set_xlabel('x')
ax.set_ylabel('y')
ax.legend()

lmfit/minimizer.py

Lines changed: 35 additions & 6 deletions
@@ -553,20 +553,37 @@ def __residual(self, fvars, apply_bounds_transformation=True):
         else:
             return coerce_float64(out, nan_policy=self.nan_policy)
 
-    def __jacobian(self, fvars):
+    def _jacobian(self, fvars, apply_bounds_transformation=True):
         """Return analytical jacobian to be used with Levenberg-Marquardt.
 
         modified 02-01-2012 by Glenn Jones, Aberystwyth University
         modified 06-29-2015 by M Newville to apply gradient scaling for
         bounded variables (thanks to JJ Helmus, N Mayorov)
 
+        Parameters
+        ----------
+        fvars : numpy.ndarray
+            Array of new parameter values suggested by the minimizer.
+        apply_bounds_transformation : bool, optional
+            Whether to apply lmfit's parameter transformation to constrain
+            parameters (default is True). This is needed for solvers
+            without built-in support for bounds.
+
+        Returns
+        -------
+        numpy.ndarray
+            The evaluated Jacobian matrix for given `fvars`.
+
         """
         pars = self.result.params
         grad_scale = np.ones_like(fvars)
         for ivar, name in enumerate(self.result.var_names):
             val = fvars[ivar]
-            pars[name].value = pars[name].from_internal(val)
-            grad_scale[ivar] = pars[name].scale_gradient(val)
+            if apply_bounds_transformation:
+                pars[name].value = pars[name].from_internal(val)
+                grad_scale[ivar] = pars[name].scale_gradient(val)
+            else:
+                pars[name].value = val
 
         pars.update_constraints()
 
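For orientation (not part of the diff): when apply_bounds_transformation is True, fvars holds internal, unbounded variable values, and grad_scale holds the per-variable chain-rule factor used to rescale the analytic Jacobian for bounded parameters. A rough numeric sketch, using the MINUIT-style transform for a doubly bounded parameter described in the lmfit documentation (the expressions are an illustration, not a copy of lmfit's code):

import numpy as np


def from_internal(val, vmin, vmax):
    # External parameter value for a doubly bounded parameter:
    # x = min + (sin(val) + 1) * (max - min) / 2
    return vmin + (np.sin(val) + 1) * (vmax - vmin) / 2


def scale_gradient(val, vmin, vmax):
    # d(external)/d(internal) of the transform above, i.e. the chain-rule factor
    return np.cos(val) * (vmax - vmin) / 2


internal_val = 0.3
print(from_internal(internal_val, 0.0, 5.0))   # external value, guaranteed to lie in [0, 5]
print(scale_gradient(internal_val, 0.0, 5.0))  # per-variable rescaling factor for the Jacobian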

@@ -931,7 +948,7 @@ def scalar_minimize(self, method='Nelder-Mead', params=None, max_nfev=None,
         # Wrap Jacobian function to deal with bounds
         if 'jac' in fmin_kws:
             self.jacfcn = fmin_kws.pop('jac')
-            fmin_kws['jac'] = self.__jacobian
+            fmin_kws['jac'] = self._jacobian
 
         # Ignore jac argument for methods that do not support it
         if 'jac' in fmin_kws and method not in ('CG', 'BFGS', 'Newton-CG',
@@ -1532,6 +1549,13 @@ def least_squares(self, params=None, max_nfev=None, **kws):
         least_squares_kws.update(self.kws)
         least_squares_kws.update(kws)
 
+        if least_squares_kws.get('Dfun', None) is not None:
+            least_squares_kws['jac'] = least_squares_kws.pop('Dfun')
+
+        if callable(least_squares_kws['jac']):
+            self.jacfcn = least_squares_kws['jac']
+            least_squares_kws['jac'] = self._jacobian
+
         least_squares_kws['kwargs'].update({'apply_bounds_transformation': False})
         result.call_kws = least_squares_kws

@@ -1639,7 +1663,7 @@ def leastsq(self, params=None, max_nfev=None, **kws):
         if lskws['Dfun'] is not None:
             self.jacfcn = lskws['Dfun']
             self.col_deriv = lskws['col_deriv']
-            lskws['Dfun'] = self.__jacobian
+            lskws['Dfun'] = self._jacobian
 
         # suppress runtime warnings during fit and error analysis
         orig_warn_settings = np.geterr()
@@ -2386,7 +2410,12 @@ def coerce_float64(arr, nan_policy='raise', handle_inf=True,
     lists of numbers, pandas.Series, h5py.Datasets, and many other array-like
     Python objects
     """
-    if np.iscomplexobj(arr):
+    if issparse(arr):
+        arr = arr.toarray().astype(np.float64)
+    elif isinstance(arr, LinearOperator):
+        identity = np.eye(arr.shape[1], dtype=np.float64)
+        arr = (arr * identity).astype(np.float64)
+    elif np.iscomplexobj(arr):
         arr = np.asarray(arr, dtype=np.complex128).view(np.float64)
     else:
         arr = np.asarray(arr, dtype=np.float64)
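As a standalone illustration of the two new branches above (not part of the diff; the helper name is hypothetical): a sparse Jacobian is densified with toarray(), and a LinearOperator is materialized by applying it to an identity matrix whose size matches the operator's number of columns:

import numpy as np
from scipy.sparse import csr_matrix, issparse
from scipy.sparse.linalg import LinearOperator, aslinearoperator


def densify_like_coerce_float64(arr):
    # Mirrors the branches added to coerce_float64 above.
    if issparse(arr):
        return arr.toarray().astype(np.float64)
    if isinstance(arr, LinearOperator):
        identity = np.eye(arr.shape[1], dtype=np.float64)
        return (arr * identity).astype(np.float64)
    return np.asarray(arr, dtype=np.float64)


jac = csr_matrix(np.array([[1.0, 0.0], [0.0, 2.0]]))
print(densify_like_coerce_float64(jac))                    # dense 2x2 float64 array
print(densify_like_coerce_float64(aslinearoperator(jac)))  # same values via the operator branch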
