A Data Science Central Community

(Nested Processes ... Each Process controlled by a Solver)

- The **transfer function** H(s) is the Laplace transform of the output signal, Yout(s), divided by the Laplace transform of the input signal, Yin(s): that is, H(s) = Yout(s) / Yin(s), where each signal's transform is assumed to be a ratio of polynomials. Thus, H(s) can likewise be stated as a ratio of polynomials.
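
A minimal sketch of this idea in Python (names are illustrative, not from the listing): represent H(s) as a ratio of polynomial coefficient lists and evaluate it on the jw axis to obtain the frequency response used in a Bode plot.

```python
def polyval(coeffs, s):
    """Evaluate a polynomial (high-to-low coefficients) at complex s via Horner's rule."""
    acc = 0 + 0j
    for c in coeffs:
        acc = acc * s + c
    return acc

def transfer(num, den, omega):
    """H(jw) = N(jw) / D(jw) for a transfer function given as two coefficient lists."""
    s = 1j * omega
    return polyval(num, s) / polyval(den, s)

# Example: H(s) = 1 / (s + 1), a first-order low-pass filter.
h0 = transfer([1.0], [1.0, 1.0], 0.0)        # DC gain -> 1
h1 = abs(transfer([1.0], [1.0, 1.0], 1.0))   # at w = 1 rad/s -> 1/sqrt(2)
```

Evaluating |H(jw)| over a grid of frequencies is exactly what a Bode-plot curve fit needs.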

- Assuming the numerator and denominator can be factored, H(s) takes the general form: H(s) = K (s − Z_1)(s − Z_2)···(s − Z_m) / [(s − P_1)(s − P_2)···(s − P_n)]

- where each Z_i is known as a "zero" and each P_i as a "pole" of the transfer function. The Z_i and P_i are complex points in the Laplace domain.

- A realizable transfer function must pair each complex pole or zero with its conjugate; that is, poles and zeros come in conjugate pairs. If a pole or zero is located at the complex point s_i + jw_i, then its conjugate is located at s_i − jw_i. Thus, a generalized transfer function is stated as a product of such conjugate-pair factors.
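
A small check of the conjugate-pair rule (pure Python, illustrative): multiplying the factor for a pole or zero at s_i + jw_i with the factor for its conjugate at s_i − jw_i yields a quadratic with purely real coefficients, which is why realizable transfer functions have real polynomial coefficients.

```python
def conjugate_pair_factor(sigma, omega):
    """Return coefficients of (s - p)(s - p*) for p = sigma + j*omega."""
    p = complex(sigma, omega)
    # (s - p)(s - p*) = s^2 - (p + p*) s + p p*
    b = -(p + p.conjugate())      # = -2*sigma; imaginary parts cancel
    c = p * p.conjugate()         # = sigma^2 + omega^2, purely real
    return [1.0, b.real, c.real]  # s^2 + b s + c, all real

coeffs = conjugate_pair_factor(-3.0, 4.0)   # pole pair at -3 +/- 4j
# coeffs == [1.0, 6.0, 25.0], i.e. s^2 + 6s + 25
```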

- Given n data points from a Bode plot (see drawing below) that define the main lobe of the desired transfer function, find the optimal pole/zero constellation such that H(s) has equal sidelobe peak amplitudes in a Bode plot and curve-fits the following data in the main lobe.

- A Calculus-level program for this optimal matched-filter transfer function is as follows:

```
Problem .Matched.Filter.Transfer.Function
  execute .Setup
  for i = 0 to 1
    sidelobes = i                        [Include Zeros On Omega Axis ?]
    Find gain, p.real, p.imag
      In .Laplace.Domain
      To Match error
  repeat
End
```

```
Model .Laplace.Domain
  if sidelobes gt 0 AND omega.zeros gt 0 then
    for ij = 1 to omega.zeros do
      side.limits( ij) = x.zeros( ij) * (1 + move( ij)) * (2**(ij-2))
      old.zeros( ij) = x.zeros( ij)
    repeat
    Find x.zeros In .Stopband By HERA
      With Bounds side.limits
      To Minimize peak.diff
    for ij = 1 to omega.zeros, move( ij) = x.zeros( ij) - old.zeros( ij)
  close
  for ij = 1 to npoints do               [ --- Calculate Transfer Function --- ]
    x2 = freq( ij) ** 2
    execute .Transfer.Function
    if den eq 0, den = 1e-8
    h( ij) = gain * num / den
    error( ij) = y.out( ij) - h( ij) * y.in( ij)
    error( ij) = error( ij) / y.out( ij) [relative error ... line optional]
  repeat
End
```
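
The curve-fit loop at the end of Model .Laplace.Domain can be sketched in plain Python (illustrative names, not the FortranCalculus runtime): the solver matches h(w)·y_in(w) against y_out(w), and the optional last line converts each residual to a relative error.

```python
def fit_errors(freq, y_in, y_out, h_of_w, relative=True):
    """Residuals between desired output y_out and modeled output h(w)*y_in."""
    errors = []
    for w, yi, yo in zip(freq, y_in, y_out):
        e = yo - h_of_w(w) * yi
        if relative:
            e = e / yo            # relative error (the optional line in the listing)
        errors.append(e)
    return errors

# Toy check with a perfect model: h(w) = y_out/y_in pointwise -> zero error.
errs = fit_errors([0, 1], [2.0, 4.0], [1.0, 1.0], lambda w: 1.0 / (2.0 + 2.0 * w))
```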

```
Model .Stopband                          [locate sidelobe peaks]
  peak.diff = 0
  for ijk = 1 to omega.zeros do
    step.limit = side.limits( ijk)
    Find x.peak In .Sidelobes By HERA
      With Bounds step.limit
      To Maximize y.peak
    peaks( ijk) = x.peak
    peak.ampl( ijk) = y.peak
    if ijk gt 1, peak.diff = peak.diff + (y.peak - peak.ampl( ijk-1))**2
  repeat
  peak.diff = peak.diff + (y.peak - peak.ampl( omega.zeros))**2
End
```
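
The equal-ripple metric that Model .Stopband minimizes can be sketched in Python (illustrative): peak.diff sums the squared differences between adjacent sidelobe peak amplitudes, so it reaches zero exactly when all sidelobe peaks are equal.

```python
def peak_diff(peak_ampl):
    """Sum of squared differences between adjacent sidelobe peak amplitudes."""
    total = 0.0
    for i in range(1, len(peak_ampl)):
        total += (peak_ampl[i] - peak_ampl[i - 1]) ** 2
    return total

equal = peak_diff([0.05, 0.05, 0.05])    # equal sidelobes -> metric is 0
ripple = peak_diff([0.06, 0.04, 0.05])   # unequal sidelobes -> metric is positive
```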

```
Model .Sidelobes       [calculate sidelobe amplitude at frequency 'x.peak']
  x2 = x.peak**2
  execute .Transfer.Function
  y.peak = num / den
End
```

```
Model .Transfer.Function
  num = 1
  den = 1
  for ii = 1 to p.pairs, den = den * .Factor( x2, p.real( ii), p.imag( ii))
  if omega.zeros gt 0 then
    for ii = 1 to omega.zeros, num = num * .Factor( x2, 0, x.zeros( ii))   [jw-axis zeros]
  close
End
```

```
Function .Factor( x.sq, sigma, omega)
  real.sq = sigma**2
  imag.sq = omega**2
  sum = real.sq + imag.sq - x.sq
  if omega eq 0, exit with sum / real.sq
  temp.f = sum * sum - 4 * x.sq * imag.sq
End with temp.f / (real.sq + imag.sq)**2
```
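
A direct Python transliteration of Function .Factor (illustrative; names follow the listing). It evaluates one conjugate-pair factor's contribution at the squared frequency x_sq, and the normalization makes every factor equal 1 at x_sq = 0, so H(0) = gain.

```python
def factor(x_sq, sigma, omega):
    """One pole/zero factor of the transfer function, per the listing above."""
    real_sq = sigma ** 2
    imag_sq = omega ** 2
    s = real_sq + imag_sq - x_sq
    if omega == 0:
        return s / real_sq                     # real-axis pole/zero, no conjugate term
    temp_f = s * s - 4 * x_sq * imag_sq        # conjugate-pair contribution
    return temp_f / (real_sq + imag_sq) ** 2   # normalized to 1 at x_sq = 0

# At zero frequency every factor is 1:
f_pair = factor(0.0, 1.0, 2.0)   # conjugate pair at -1 +/- 2j -> 1.0
f_real = factor(0.0, 3.0, 0.0)   # real pole at -3            -> 1.0
```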

```
Procedure .Setup
  freq = .data( 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12,
                13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24)
  y.in = .data( 51.31, 37.79, 28.26, 21.1, 15.37, 11.32, 8.06, 5.83, 3.91, 2.69,
                1.78, .96, .52, .31, .25, .21, .18, .15, .12, .10, .08, .07, .06, .06, .05)
  [y.out(w) = optimum response in the linear Van der Maas sense]
  y.out = .data( 1, .981, .926, .836, .73, .609, .484, .365, .26, .173,
                 .105, .058, .027, .012, 7.2e-3, 4.4e-3, 2.7e-3, 1.7e-3,
                 1e-3, 6.2e-4, 3.8e-4, 2.3e-4, 1.4e-4, 8.8e-5, 5.4e-5)
  npoints = 25
  gain = 1
  p.pairs = 5
  omega.zeros = 3
  fmax = freq( npoints)
  allot h( npoints), error( npoints)
  allot p.real( p.pairs), p.imag( p.pairs)
  for i = 1 to p.pairs do                       [initial guess]
    p.imag( i) = ((i-1) / p.pairs + .11) * fmax
    p.real( i) = fmax
  close
  if omega.zeros gt 0 then
    allot x.zeros( omega.zeros), old.zeros( omega.zeros), move( omega.zeros),
          side.limits( omega.zeros), peaks( omega.zeros), peak.ampl( omega.zeros)
    for i = 1 to omega.zeros, x.zeros( i) = (1 + 2**(i-1) / 10) * fmax   [initial guess]
  close
End
```

**Note:** This is a multi-level optimization (nesting of optimizers) example problem. Each **Find** statement starts a solver, and at times up to three **Find** statements are executing at once (i.e., nested). This nesting power should allow companies to optimize at many levels and combine all of those optimizations when necessary to truly optimize company profits. Nice!
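
The nesting idea can be sketched in pure Python on a hypothetical toy problem (this is not the FortranCalculus HERA solver): an inner "Find" maximizes a curve over an interval, while the outer "Find" adjusts a parameter until the inner solver's peak hits a target, i.e., one optimizer running inside another optimizer's objective, just as in the listing above.

```python
def inner_find_peak(f, lo, hi, iters=200):
    """Inner solver: locate the maximum of f on [lo, hi] by ternary search."""
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if f(m1) < f(m2):
            lo = m1
        else:
            hi = m2
    return (lo + hi) / 2

def outer_find_scale(target, iters=100):
    """Outer solver: bisect on 'a' so the peak of a*x*(1-x) equals target."""
    lo, hi = 0.0, 10.0
    for _ in range(iters):
        a = (lo + hi) / 2
        # Each outer evaluation launches the inner solver -- nested Finds.
        x_peak = inner_find_peak(lambda x: a * x * (1 - x), 0.0, 1.0)
        peak = a * x_peak * (1 - x_peak)
        if peak < target:
            lo = a
        else:
            hi = a
    return (lo + hi) / 2

a_opt = outer_find_scale(1.0)   # peak of a*x*(1-x) is a/4, so a_opt -> 4
```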

**Note:** The Match-n-Freq application solves exactly this multi-level optimization problem. Download the Match-n-Freq application file, give it a test run, and see the nested solvers at work.

- This *filter-design problem* is another example of increased productivity due to Calculus-level programming.

- Kost, R.E. and Brubaker, P.B., "Arbitrary Equalization with Simple LC Structures", IEEE Transactions on Magnetics, Vol. MAG-17, No. 6, November 1981.

