Chapter, January 2016. DOI: 10.4018/978-1-4666-8654-0.ch002

Total Variation Applications in Computer Vision

Vania V. Estrela, Hermes A. Magalhaes, and Osamu Saotome

ABSTRACT
The objectives of this chapter are: (i) to introduce a concise overview of regularization; (ii) to define and explain the role of a particular type of regularization called the total variation norm (TV-norm) in computer vision tasks; (iii) to set up a brief discussion of the mathematical background of TV methods; and (iv) to establish a relationship between models and a few existing methods to solve problems cast as TV-norm minimization. For the most part, image-processing algorithms blur the edges of the estimated images, whereas TV regularization preserves the edges with no prior information on the observed and the original images. The scalar regularization parameter controls the amount of regularization allowed and is essential to obtaining a high-quality regularized output. A wide-ranging review of several ways to put TV regularization into practice, as well as its advantages and limitations, is presented.

Keywords: Total Variation, Regularization, Computer Vision, Machine Learning, Variational Methods, Computational Intelligence, Image Processing.

1. INTRODUCTION
This chapter investigates robustness properties of machine learning (ML) methods based on convex risk minimization applied to computer vision. Kernel regression, support vector machines (SVMs), and least squares (LS) can be regarded as special cases of ML. The minimization of a regularized empirical risk based on convex functionals has an essential role in statistical learning theory (Vapnik, 1995), because (i) such classifiers are generally consistent under weak conditions; and (ii) robust statistics investigate the impact of data deviations on the results of estimation, testing or prediction methods.

In practice, one has to apply ML methods, which are nonparametric tools, to a data set with a finite sample size. Even so, the robustness issue is important, because the assumption that all data points were independently generated by the same distribution can be contravened, and outliers habitually occur in real data sets.

The real use of regularized learning methods depends significantly on the option to put together intelligent models fast and successfully, besides calling for efficient optimization methods. Many ML algorithms involve the ability to compare two objects by means of the similarity or distance between them. In many cases, existing distance or similarity functions such as the Euclidean distance are enough. However, some problems require more appropriate metrics. For instance, since the Euclidean distance uses the L2-norm, it is likely to perform poorly in the presence of outliers. The Mahalanobis distance is a straightforward and all-purpose method that subjects data to a linear transformation. Notwithstanding, Mahalanobis distances have two key problems: 1) the parameter vector to be learned grows quadratically as the data dimension grows, which poses a problem related to dimensionality; and 2) learning a linear transformation is not sufficient for data sets with nonlinear decision boundaries.

Models can also be selected by means of regularization methods, that is, models are penalized depending on the number of parameters (Alpaydin, 2004)(Fromont, 2007). Generally, Bayesian learning techniques make use of knowledge on the prior probability distributions in order to assign lower probabilities to models that are more complicated. Some popular model selection techniques are the Akaike information criterion (AIC), the Takeuchi information criterion (TIC), the Bayesian information criterion (BIC), the cross-validation technique (CV), and the minimum description length (MDL).

This chapter aims at showing how Total Variation (TV) regularization can be practically implemented in order to solve several computer vision applications, although it is still a subject under research. TV was initially introduced in (Rudin, Osher, & Fatemi, 1992) and, since then, it has found several applications in computer vision, such as image restoration (Rudin & Osher, 1994), image denoising (Mateos, Molina & Katsaggelos, 2005)(Molina, Vega & Katsaggelos, 2007), blind deconvolution (Chan & Wong, 1998), resolution enhancement (Guichard & Malgouyres, 1998), compression (Alter, Durand, & Froment, 2005), motion estimation (Drulea & Nedevschi, 2011), and texture segmentation/discrimination (Roudenko, 2004). These applications involve the use of TV regularization, which allows selecting the best solution from a set of several possible ones.

2. BACKGROUND
2.1 Regularization
In machine learning (ML) and inverse problems, regularization brings in extra information to solve an ill-posed problem and/or to circumvent overfitting. Representative information is taken into consideration via insertion of a penalty function based on constraints for solution smoothness or bounds on the vector space norm. Representative cases of regularization in statistical ML involve methods like ridge regression, lasso, and the L2-norm, for example.
If a problem has (a) a unique solution, and (b) a solution that is robust to small data perturbations, then it is called well-posed. When at least one of these conditions is violated, it is named ill-posed and requires special care. Non-uniqueness is a consequence of not having enough data on the original model, and it is not detrimental at all times. Depending on the desired characteristics of a good solution or some measures of goodness, an estimate can be picked from a set of multiple solutions. Nevertheless, if one does not know how to evaluate an estimate, then a very good way to handle non-uniqueness is to enforce some prior information about the domain in order to constrain the solution set.

Instability results from an effort to undo cause-effect relations. Solving a forward problem is the most natural way of finding a solution, since cause always goes before effect. In reality, one has access to corrupted measures, which means one aims at finding the cause without a closed-form description of the system being analyzed (system model).
Regularization can be isotropic or anisotropic in the smoothness terms. Isotropic regularization schemes apply the smoothness constraint equally in all directions. Anisotropic formulations relax the smoothness constraint at boundaries, letting smoothing occur along the borders but not transversal to them.
The concept of regularization relies on the use of norms. This chapter will only consider expressions of the form

||x||_p = ( Σ_i |x_i|^p )^(1/p) ,    (1)

where the most popular ones are described as follows:

- L2 norm: ||x||_2 = ( Σ_i x_i² )^(1/2), also known as the Euclidean distance. Algorithms relying on it generate smooth results, which penalize image edges.
- L1 (Manhattan) norm: ||x||_1 = Σ_i |x_i|, the sum of the absolute values of the distances in the original space. Algorithms using this norm preserve image edges, although they are time-consuming.
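The contrast between the two norms can be checked numerically. Below is a minimal NumPy sketch of the general p-norm of equation (1); the two made-up vectors are illustrative only. Note how the L2 norm penalizes a vector whose energy is concentrated in one large jump (the discrete analogue of an image edge) more than a spread-out one, while the L1 norm treats both alike.

```python
import numpy as np

def p_norm(x, p):
    # General p-norm of equation (1): ||x||_p = (sum_i |x_i|^p)^(1/p)
    return np.sum(np.abs(x) ** p) ** (1.0 / p)

smooth = np.array([1.0, 1.0, 1.0, 1.0])   # gentle, spread-out differences
jump = np.array([0.0, 0.0, 0.0, 4.0])     # one large, edge-like difference

l1_smooth, l1_jump = p_norm(smooth, 1), p_norm(jump, 1)   # both 4.0
l2_smooth, l2_jump = p_norm(smooth, 2), p_norm(jump, 2)   # 2.0 versus 4.0
```

Under the squared L2 penalty used in least squares, the edge-like vector is the more expensive of the two, which is why L2-based algorithms smooth edges away.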

2.2 Least Squares (LS) Regularization


Given a system g = Hf, where H is a forward operator, the simplest form of regularizing an ill-posed problem is the linear Least Squares (LS) method with the Euclidean (L2) norm, which aims at minimizing the residual

ε²_LS = J(f) = ||Hf − g||²₂ .    (2)

That is,

f_LS = (H^T H)^(-1) H^T g ,    (3)

where J(f) is the functional to be minimized, ε²_LS is the squared norm of the residuals, f_LS is an estimate of f according to the LS criterion, and ||·||²₂ = ||·||². If H is ill-conditioned or singular, then (3) may not be a good estimate. If a regularization term ||Qf||² (also known as a regularization penalty) is included in this minimization functional, it leads to the Regularized Least Squares (RLS) estimate (Coelho & Estrela, 2012)(Kang & Katsaggelos, 1995)(Molina, Vega, & Katsaggelos, 2007). Hence, the new functional becomes

J(f) = ε²_RLS = ||Hf − g||² + ||Qf||² .    (4)

The most common case is Q = λI, where λ is a scalar regularization parameter and I is the identity matrix; nevertheless, other types of regularization matrix Q can be chosen in order to enhance problem conditioning (such as a first- or second-order differentiation matrix), therefore making possible a better numerical solution, given by

f_RLS = (H^T H + Q^T Q)^(-1) H^T g .    (5)

From a Bayesian point of view, this is equivalent to adding some additional assumptions in order to obtain a stable estimate. Statistically, the prior on f is frequently assumed to be zero-mean Gaussian with independent and identically distributed (iid) components with identical standard deviation σ_f. The data g are also erroneous, iid with zero mean and standard deviation σ_g. When Q = λI, f_RLS has λ = σ_g/σ_f according to the previous expression.
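The stabilizing effect of the penalty in (5) can be sketched in a few lines of NumPy. The Vandermonde matrix, noise level, and λ below are hypothetical choices made only to produce an ill-conditioned H; the point is that the plain LS solve of (3) amplifies the noise, while the RLS solve of (5) with Q = λI keeps the estimate stable.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 12
H = np.vander(np.linspace(0.1, 1.0, n), n, increasing=True)  # ill-conditioned
f_true = np.sin(np.linspace(0.0, np.pi, n))
g = H @ f_true + 0.01 * rng.standard_normal(n)               # noisy observation

lam = 1e-3
Q = lam * np.eye(n)                                  # Q = lam * I, as in the text

f_ls = np.linalg.solve(H.T @ H, H.T @ g)             # equation (3)
f_rls = np.linalg.solve(H.T @ H + Q.T @ Q, H.T @ g)  # equation (5)

err_ls = np.linalg.norm(f_ls - f_true)     # blows up with the conditioning
err_rls = np.linalg.norm(f_rls - f_true)   # typically far smaller
```

The same comparison can be repeated for different λ to see the trade-off between noise amplification (λ too small) and over-smoothing bias (λ too large).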
Generalized Tikhonov regularization (GTR)
A more general functional is

J(f) = ||Hf − g||²_P + ||f − f₀||²_Q ,    (6)

where ||f||²_Q = f^T Q f is a weighted norm. A Bayesian analysis shows that P is the inverse covariance matrix of g, f₀ = E{f}, and Q is the inverse covariance matrix of f. The Tikhonov matrix Γ is obtained from Q = Γ^T Γ (the Cholesky factorization), and it can be regarded as a whitening filter.
The resulting estimate is f_GTR = f₀ + (H^T P H + Q)^(-1) H^T P (g − Hf₀). The LS and RLS estimates are special cases of the GTR solution (Blomgren & Chan, 1998)(Chan & Wong, 1998).

Usually, the discretization of integral equations leads to discrete ill-conditioned problems, and Tikhonov regularization can be applied in the original infinite-dimensional space. The previous expression can be interpreted as follows: H is a compact operator on a Hilbert space, and f and g are elements in the domain and range of H, respectively. This implies that the operator (H*H + Q^T Q) is a self-adjoint, bounded, invertible operator.
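A quick NumPy check (with an arbitrary small system and hypothetical σ_g, σ_f) illustrates the claim that RLS is a special case of GTR: with P = I/σ_g², Q = I/σ_f², and f₀ = 0, the GTR estimate coincides with the RLS estimate for λ = σ_g/σ_f.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
H = rng.standard_normal((n, n))          # arbitrary small forward operator
g = rng.standard_normal(n)               # observation

sigma_g, sigma_f = 0.5, 2.0              # hypothetical data / prior deviations
P = np.eye(n) / sigma_g**2               # inverse covariance of g
Q = np.eye(n) / sigma_f**2               # inverse covariance of f
f0 = np.zeros(n)                         # prior mean E{f}

# GTR estimate: f_GTR = f0 + (H^T P H + Q)^(-1) H^T P (g - H f0)
f_gtr = f0 + np.linalg.solve(H.T @ P @ H + Q, H.T @ P @ (g - H @ f0))

# RLS estimate of Section 2.2 with Q = lam * I and lam = sigma_g / sigma_f
lam = sigma_g / sigma_f
f_rls = np.linalg.solve(H.T @ H + lam**2 * np.eye(n), H.T @ g)
```

Multiplying the GTR normal equations by σ_g² shows the two solves are algebraically identical, so the two vectors agree to machine precision.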

2.3 Total Variation Regularization

Total Variation (TV) regularization is a deterministic technique that safeguards discontinuities in image processing tasks. For a known kernel H, the true image f satisfies the relationship g ≈ Hf, where the approximation symbol accounts for noise. With the purpose of imposing uniqueness and circumventing distortions, the predicted image can be described as the minimizer of the functional

J_TV(f) = (1/2)||Hf − g||²₂ + λ TV(f) ,    (7)

where r = (x, y) is a pixel location, ∇f is the gradient of f, and λ ≥ 0 is a hyperparameter whose value depends on the amount of noise. The TV norm is defined by

TV(f) = ∫ |∇f|₂ dx dy .    (8)

In the previous expression, ∇f is the gradient of f, and the term ||Hf − g||² is a fidelity (penalty) term. Recently, various efficient implementations of the TV norm have been proposed. Nevertheless, finding the exact value of λ is computationally demanding. There are some difficulties concerning the discretization of the previous equation, because it introduces high-frequency artifacts in the estimated solution, but it can be proven that these artifacts can be avoided by different TV discretization strategies (Sfikas, Nikou, Galatsanos, & Heinrich, 2011). The main advantage of TV regularization is the fact that this variational approach has edge-preserving properties, but textures and fine-scale details are still removed. Given that it is not possible to differentiate TV(f) at zero, a small constant β > 0 is placed in the preceding expression in this fashion:

TV(f) = ∫ sqrt( |∇f|² + β² ) dx dy .    (9)

The TV regularization term allows selecting, amid numerous potential estimates, the optimal one. With the intention of enforcing uniqueness and evading serious ringing artifacts, the estimated image will be the value of f that minimizes the following functional:

J_TV(f) = (1/2)||Hf − g||²₂ + λ ∫ sqrt( |∇f|² + β² ) dx dy .    (10)

Knowledge of the image discontinuities is accounted for by the gradient magnitude term ∫ sqrt(|∇f|² + β²) dx dy. Finding the estimate involves two steps: to define a discrete version of J_TV(f) for images and to find an algorithm to minimize the discrete problem. Provided the chosen algorithm converges, the solution will depend only on the discretization selected, which is a modeling concern.
Analogously to what happens in RLS (Galatsanos & Katsaggelos, 1992), λ is very important when it comes to controlling the amount of noise allowed in the process. If λ = 0, then no denoising is applied and the outcome is equal to the original image. On the other hand, as λ → ∞, the TV term becomes progressively stronger, and the output image becomes more different from the original one, corresponding to having smaller TV. Consequently, the selection of the regularization parameter λ is vital to attain the adequate amount of noise elimination.
Practically speaking, the gradient can be approximated by means of different norms. The TV norm introduced by (Rudin, Osher, & Fatemi, 1992) is named TV_fro in this text. It is isotropic, L2-based, and non-differentiable. The gradient of f is ∇f = (D^x_{i,j} f, D^y_{i,j} f), where

D^x_{i,j} f = (f_{i,j} − f_{i−1,j}) / Δx ,  and  D^y_{i,j} f = (f_{i,j} − f_{i,j−1}) / Δy .

With Δx = Δy = 1, (9) can be rewritten as

TV_fro(f) = Σ_{i,j} sqrt( |f_{i+1,j} − f_{i,j}|² + |f_{i,j+1} − f_{i,j}|² + β² ) .    (11)

Since other choices of discretization are possible for the gradient, an alternative to TV_fro relying on the L1-norm for an M×N image can be obtained by taking into consideration the following relationships and approximations:

TV_{L1-fro}(f) = Σ_{i,j} sqrt( |f_{i+1,j} − f_{i,j}|² + |f_{i,j+1} − f_{i,j}|² )
             ≈ Σ_{i,j} ( |f_{i+1,j} − f_{i,j}| + |f_{i,j+1} − f_{i,j}| ) .    (12)

TV_{L1-fro} is easier to minimize; it is also anisotropic and less time-consuming. Because unraveling this denoising problem is far from trivial, modern investigations on compressed sensing algorithms such as (Chambolle, 2004)(Donoho, 2008)(Friedman, Hastie, & Tibshirani, 2010)(Afonso, Bioucas-Dias, & Figueiredo, 2011) solve variants of the original TV-norm problem.

The modification to the Lp-norm has a remarkable effect on the calculation of the estimate. It has been shown by P. C. Hansen that the solution consists of polynomial pieces, and the degree of the polynomials is p − 1.
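Both discrete TV flavors are direct to code. The sketch below (NumPy; the test image is a made-up vertical step edge) implements (11) and the L1 approximation of (12); evaluating them on a clean edge and on its noisy version reflects the principle used throughout this chapter: noise raises the TV value.

```python
import numpy as np

def tv_fro(f, beta=0.0):
    # Isotropic discrete TV of equation (11), with optional smoothing beta.
    dx = np.diff(f, axis=0)[:, :-1]     # f[i+1, j] - f[i, j]
    dy = np.diff(f, axis=1)[:-1, :]     # f[i, j+1] - f[i, j]
    return np.sum(np.sqrt(dx**2 + dy**2 + beta**2))

def tv_l1(f):
    # Anisotropic L1-based approximation of equation (12).
    return np.abs(np.diff(f, axis=0)).sum() + np.abs(np.diff(f, axis=1)).sum()

edge = np.zeros((16, 16))
edge[:, 8:] = 1.0                                  # one sharp vertical edge
noisy = edge + 0.1 * np.random.default_rng(2).standard_normal(edge.shape)
```

For the clean edge both sums count only the jump column, while for the noisy image every pixel contributes, which is why minimizing TV removes noise before it affects edges.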

The problem can be better stated by defining a function φ(t) = sqrt(β² + t²) and rewriting (9), (10), and (12) as follows:

TV(f) = (1/2) Σ_{i=1..M} Σ_{j=1..N} sqrt( (D^x_{i,j} f)² + (D^y_{i,j} f)² + β² ) ,  and

TV(f) = (1/2) Σ_{i=1..M} Σ_{j=1..N} φ( |f_{i+1,j} − f_{i,j}| + |f_{i,j+1} − f_{i,j}| ) .    (14)

The gradient of the Total Variation from (14) is given by

∇TV(f) = (1/2) Σ_{i=1..M} Σ_{j=1..N} φ'( |f_{i+1,j} − f_{i,j}| + |f_{i,j+1} − f_{i,j}| ) .    (15)

A new estimate f^{k+1} can be stated as a function of f^k with the help of the relationships

f^{k+1} = [ H^T H + λ L(f^k) ]^(-1) H^T g ,  and
f^{k+1} = f^k − [ H^T H + λ L(f^k) ]^(-1) T(f^k) .    (16)

The regularization operator L(f^k) can be computed using the expression

L(f) = D_x^T diag(φ'(f)) D_x + D_y^T diag(φ'(f)) D_y
     = [ D_x^T  D_y^T ] [ diag(φ'(f))  0 ; 0  diag(φ'(f)) ] [ D_x ; D_y ] ,    (17)

where

φ'_{i,j}(f) = φ'( |D^x_{i,j} f| + |D^y_{i,j} f| ) .

The desired value of the estimate can be computed by means of several numerical algorithms. The Conjugate Gradient (CG) algorithm is the simplest one, and it needs initial values of λ, f⁰, τ, and K, where τ is the error tolerance between estimates and K is the maximum number of iterations. f⁰ can be an even image (a one-color rectangle where all intensities have the value of the mean of the intensities). CG computes the initial gradients and searches for a new direction to proceed; it computes the conjugate gradient of (13) and converges slowly and linearly to the best solution.
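As a toy illustration of how λ and β act, the sketch below denoises with H = I by plain gradient descent on the smoothed functional (10), instead of the CG/fixed-point scheme described above; the step size, λ, β, iteration count, and test image are hypothetical values chosen small enough for the explicit iteration to be stable.

```python
import numpy as np

def grad_smoothed_tv(f, beta):
    # Gradient of sum sqrt(|grad f|^2 + beta^2), using forward differences
    # with replicated boundaries; equals -div( grad f / sqrt(|grad f|^2 + beta^2) ).
    dx = np.vstack([np.diff(f, axis=0), np.zeros((1, f.shape[1]))])
    dy = np.hstack([np.diff(f, axis=1), np.zeros((f.shape[0], 1))])
    mag = np.sqrt(dx**2 + dy**2 + beta**2)
    px, py = dx / mag, dy / mag
    div = np.vstack([px[:1], np.diff(px, axis=0)]) + \
          np.hstack([py[:, :1], np.diff(py, axis=1)])
    return -div

def tv_denoise(g, lam=0.2, beta=0.1, step=0.05, n_iter=300):
    # Gradient descent on J(f) = 0.5 * ||f - g||^2 + lam * TV_beta(f), H = I.
    f = g.copy()
    for _ in range(n_iter):
        f -= step * ((f - g) + lam * grad_smoothed_tv(f, beta))
    return f

rng = np.random.default_rng(5)
clean = np.zeros((16, 16)); clean[:, 8:] = 1.0     # piecewise-constant image
noisy = clean + 0.2 * rng.standard_normal(clean.shape)
denoised = tv_denoise(noisy)
```

Raising lam flattens the result toward a constant image (smaller TV), while lam = 0 returns the noisy input unchanged, matching the behavior of λ described above.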

3. APPLICATIONS OF TV-NORM IN COMPUTER VISION


3.1 Computation Challenges
In order to minimize J_TV(f), the gradient has to be computed. Differentiating J_TV(f) with respect to f yields the following nonlinear equation:

T(f) = ∇J_TV(f) = −λ ∇·( ∇f / |∇f| ) + H*(Hf − g) = 0 .    (13)

The preceding minimization has the following computational challenges:
- the operator ∇·( ∇f / |∇f| ) is extremely nonlinear; and
- H*H can be ill-conditioned, which leads to numerical difficulties.

The conjugate gradient (CG) method can be used to solve (13). This procedure generates consecutive approximations of the estimate, the errors associated with the iterations, and the acceptable search directions used to revise all the required variables.

Although several schemes (Vogel & Oman, 1996)(Chambolle, 2004) have been devised to minimize J_TV(f), it continues to be a time-consuming enterprise, because it poses severe computational loads for problems with a large H that lack some high-speed realization trick and/or suitable matrix representation.

Figure 1. The left column shows noisy images and the right column illustrates TV denoising by means of the TV algorithm proposed by (Chambolle, 2004)

3.1.1 TV Denoising or Deconvolution
The application of Bayesian models to blind deconvolution is complicated when there is a difficult-to-handle probability density function (pdf) involving the hidden variables f and H and the given observations g (Mateos, Molina, & Katsaggelos, 2005). This fact makes it impossible to use simpler and less computationally demanding algorithms such as the EM technique. On the other hand, with variational approaches it is feasible to circumvent this problem.

The standard TV denoising problem has the form of (7) with H = I, and it is a way of removing noise from images. It relies on the principle that signals with excessive, and perhaps spurious, detail have a high TV (the integral of the absolute gradient of the image has an elevated value). Hence, reducing the TV norm of the image while keeping it very close to the original image takes out unnecessary detail while conserving significant features such as boundaries (Rudin, Osher, & Fatemi, 1992).

This noise elimination procedure is better than simpler practices, which diminish noise but at the same time wipe out edges to a greater or lesser extent. Moreover, TV-norm denoising is extremely successful at preserving boundaries while concomitantly eliminating noise in flat regions, regardless of the signal-to-noise ratio (SNR) (Strong & Chan, 2003)(Caselles, Chambolle, & Novaga, 2011).

Figure 1 shows an example of image denoising when white Gaussian noise of a priori known level is added to the original image.
3.1.2 TV Minimizing Blind Deconvolution
In the previous case, deconvolution was performed with the help of a known point spread function (PSF). Blind deconvolution (BD) allows the recuperation of an image from a defectively observed or unknown PSF of one or a set of several blurred images. BD estimates the PSF from an image or image set; hence, there are no assumptions made about H. BD improves the estimates of the PSF and of the true image f at each iteration. Popular BD methods comprise maximum a posteriori (MAP) estimation and expectation-maximization (EM) algorithms. A good initial PSF guess facilitates faster convergence, although it is not indispensable.
(Money, 2006) broadened the minimizing functional proposed by (Chan & Wong, 1998) by adding a reference image to improve the quality of the estimated image and to reduce the computational load of BD as follows:

T(f, H) = ||Hf − g||²/2 + α₁ TV(H) + α₂ TV(f) .    (14)

Then, the problem can be recast as solving the equivalent Euler-Lagrange equations

1) f*(Hf − g) − α₁ ∇·( ∇H / |∇H| ) = 0 , solve for H;    (15a)

2) H*(Hf − g) − α₂ ∇·( ∇f / |∇f| ) = 0 , solve for f.    (15b)

The subsequent algorithm was proposed by (Chan & Wong, 1998), and it is called the Alternating Minimization (AM) method.

AM Algorithm

1) Initial conditions: α₁, α₂, f⁰, τ, H⁰ and K, where τ is the error tolerance between estimates, H⁰ is the initial estimate of H, and K is the maximum number of iterations.

2) while (k < K) and ||f^k − f^{k−1}|| > τ do
       Solve (15a) for H^k.
       Solve (15b) for f^k.
   end

3.1.3 Image Restoration

The restoration obtained with regularization methods like RLS smoothes out the edges in the restored image. This can be alleviated by methods relying on L1-norm regularization, such as TV regularization, which keeps the edges in the estimated image. The purpose is to recuperate a real image from an image distorted by several simultaneous phenomena, such as blur and noise, using the TV norm.

The image formation (forward process) is modeled mathematically using the expression

g = Hf + n ,

where g is the observed distorted and noisy image, H represents some point spread function (PSF) or blurring function, f is the original image, and n is white Gaussian noise.
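This forward model is easy to simulate. The sketch below (NumPy; the 3×3 moving-average blur and the noise level are made-up choices standing in for a real PSF) produces an observation g = Hf + n of the kind the reconstructions in Figure 2 would start from.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical original image f: a bright square on a dark background.
f = np.zeros((32, 32))
f[10:22, 10:22] = 1.0

def blur(img):
    # H: a simple 3x3 moving-average blur, applied with replicated borders.
    p = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            out += p[1 + di : 1 + di + img.shape[0], 1 + dj : 1 + dj + img.shape[1]]
    return out / 9.0

n = 0.05 * rng.standard_normal(f.shape)  # white Gaussian noise
g = blur(f) + n                          # observed image: g = Hf + n
```

Because the square sits away from the borders, the blur conserves the total intensity while smearing the edges, which is exactly the degradation TV restoration is meant to undo.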

The Least Absolute Shrinkage and Selection Operator (LASSO) estimator is a shrinkage and selection procedure relying on the L1 norm for linear regression, created by (Tibshirani, 1996). The LASSO estimate is calculated by means of the minimization of a quadratic problem consisting of the customary sum of squared errors, with a bound on the sum of the absolute values of the coefficients f_j, and it is defined by

min_f Σ_i ( g_i − Σ_j H_{ij} f_j )²  subject to  Σ_j |f_j| ≤ c ,

where c is a parameter controlling the amount of regularization.
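In its equivalent Lagrangian form, min_f 0.5*||Hf − g||² + λ||f||₁, the LASSO can be solved with the classic iterative shrinkage-thresholding algorithm (ISTA). The sketch below is a minimal NumPy version on a made-up sparse regression problem; it uses the Lagrangian form with a hypothetical λ, not the constrained form with bound c given above.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the L1 norm: shrinks each entry toward zero.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_ista(H, g, lam, n_iter=500):
    # ISTA for min_f 0.5*||Hf - g||^2 + lam*||f||_1.
    L = np.linalg.norm(H, 2) ** 2          # Lipschitz constant of the data term
    f = np.zeros(H.shape[1])
    for _ in range(n_iter):
        f = soft_threshold(f - (H.T @ (H @ f - g)) / L, lam / L)
    return f

rng = np.random.default_rng(4)
H = rng.standard_normal((30, 10))
f_true = np.zeros(10)
f_true[[2, 7]] = [1.5, -2.0]               # sparse ground-truth coefficients
g = H @ f_true + 0.01 * rng.standard_normal(30)
f_hat = lasso_ista(H, g, lam=0.1)
```

The soft-thresholding step zeroes out the coefficients whose correlation with the residual stays below λ, which is the "shrinkage and selection" behavior described above.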

Figure 2. Reconstruction using Total Variation regularization and LASSO regularization (left to right: blurred and noisy image; Total Variation reconstruction; LASSO reconstruction)


TV techniques conserve edge information in computer vision algorithms at the expense of a high computational load. Restoration time is lower and image quality is better with LASSO than with L1-based TV regularization. Hence, for some settings, LASSO provides a noteworthy alternative to the L1-TV norm. Studies illustrate that an increase in the amount of blur amplifies the restoration error when the L1 norm is employed, and an increase in the noise level exerts a significant influence on the residual error (Agarwal, Gribok, & Abidi, 2007). Nevertheless, since there are other ways of calculating the TV norm, the computation time and the estimation quality can be further improved to outdo LASSO. Figure 2 shows restored images obtained from an observed image subjected to blurring and noise.

3.1.4 Optical Flow Estimation

In computer vision, the motion estimation problem in a video sequence was first studied by (Horn & Schunck, 1981), and it can bring in lots of information to help understand a given system, scenario and/or problem. Characteristically, the goal is to identify the displacement vector field (DVF) relating successive frames, also known as the optical flow (OF). Variational techniques are very important in this context, since they allow for accurate estimation of the DVFs rooted in the minimization of functionals.
Consider an image sequence f(x, y, t), where (x, y) stands for the location within a frame domain and t ∈ [0, T] indicates time. Then, the postulate of unchanging brightness along time (invariance of the displaced frame difference, also known as DFD) can be stated as

DFD(f, u, v) = f(x + u, y + v, t + 1) − f(x, y, t) = 0 .    (16)

With the help of a Taylor expansion and after dropping all higher-order terms, one obtains its linearized form, the so-called optic flow constraint (OFC)

f_x u + f_y v + f_t = 0 .    (17)

Here, the function u(x, y, t) = (u, v) is the wanted DVF, and the subscripts denote partial derivatives. There are different types of regularization for the non-unique solution of the OF problem:
1. Uniform regularization takes for granted an overall smoothness constraint and does not adapt itself to semantically noteworthy image and/or OF structures;
2. Image-driven regularization assumes piecewise smoothness and respects discontinuities in the image (Nagel & Enkelmann, 1986); and
3. OF-driven regularization assumes piecewise smoothness and respects borders in the DVF, as in (Cohen, 1993)(Weickert & Schnörr, 2001).
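Class 1 (uniform regularization) is exactly the original Horn & Schunck model, which penalizes the OFC residual plus a quadratic smoothness term α(|∇u|² + |∇v|²) and admits a simple Jacobi-style iteration. The sketch below is a minimal NumPy version; the synthetic blob, α, and iteration count are made-up test choices.

```python
import numpy as np

def horn_schunck(f1, f2, alpha=0.1, n_iter=400):
    # Uniform (quadratic) regularization of the OFC: fx*u + fy*v + ft = 0.
    fx = np.gradient(f1, axis=1)          # spatial derivatives of frame 1
    fy = np.gradient(f1, axis=0)
    ft = f2 - f1                          # temporal derivative
    u = np.zeros_like(f1)
    v = np.zeros_like(f1)

    def local_mean(w):                    # 4-neighbour average (periodic borders)
        return (np.roll(w, 1, 0) + np.roll(w, -1, 0) +
                np.roll(w, 1, 1) + np.roll(w, -1, 1)) / 4.0

    for _ in range(n_iter):
        u_bar, v_bar = local_mean(u), local_mean(v)
        t = (fx * u_bar + fy * v_bar + ft) / (alpha**2 + fx**2 + fy**2)
        u = u_bar - fx * t
        v = v_bar - fy * t
    return u, v

# Synthetic pair: a smooth blob shifted one pixel to the right.
x = np.linspace(-1.0, 1.0, 32)
X, Y = np.meshgrid(x, x)
f1 = np.exp(-(X**2 + Y**2) / 0.2)
f2 = np.roll(f1, 1, axis=1)
u, v = horn_schunck(f1, f2)
```

For this rightward shift the recovered horizontal flow u is positive over the blob while v stays near zero; being uniform, the smoothness term would also blur any genuine motion discontinuity, which is what motivates classes 2 and 3.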
Variational methods are among the finest techniques for estimating the OF by means of error evaluation procedures; however, they are frequently slow for real-time applications (Slesareva, Bruhn, & Weickert, 2005)(Brox, Bruhn, & Weickert, 2006). For the most part, the computational costs for solving the nonlinear system of equations via typical numerical methods are considered significantly elevated. Variational schemes relying on bidirectional multigrids generate a refined hierarchy of equation systems with first-rate error decrease.
OF algorithms based on variational approaches have been gaining lots of popularity because they handle dense flow fields, and their performance can be good if spatial and temporal discontinuities are retained in the video. Regrettably, this flexibility implies an elevated computational load, but this does not preclude real-time performance if methods such as multigrid are employed (Bruhn, Weickert, Kohlberger, & Schnörr, 2005).
Variational computer vision algorithms belong to one of the following classes: i) anisotropic image-driven techniques, as proposed by (Nagel & Enkelmann, 1986), which result in a linear system of equations; and ii) isotropic OF-driven schemes with TV regularization, which involve solving a nonlinear system of equations.

The anisotropic image-driven functional of (Nagel & Enkelmann, 1986) is

J(u, v) = ∫ ( (f_x u + f_y v + f_t)² + α ( ∇u^T D(∇f) ∇u + ∇v^T D(∇f) ∇v ) ) dx dy ,    (18)

where ∇ = (∂x, ∂y)^T denotes the spatial gradient and D(∇f) is a projection matrix perpendicular to ∇f that is defined as

D(∇f) = 1/( |∇f|² + 2β² ) [ f_y² + β²   −f_x f_y ; −f_x f_y   f_x² + β² ] = [ a  b ; b  c ] .    (19)

Here β serves as a parameter that prevents the matrix D(∇f) from being singular. The minimization of this convex functional comes down to solving the equations

f_x² u + f_x f_y v + f_x f_t − α L_AN(u, v) = 0 ,  and    (20)

f_x f_y u + f_y² v + f_y f_t − α L_AN(v, u) = 0 ,    (21)

with

L_AN( z(x,y), z̃(x,y) ) = ∇·( D(∇f) ∇z(x,y) ) .    (22)

In contrast to image-driven regularization methods, OF-driven techniques trim down smoothing where edges in the flow field occur during computation. For this class of variational OF techniques, (Drulea & Nedevschi, 2011) proposed an isotropic method that penalizes deviations from the smoothness constraint with the L1-norm of the flow gradient magnitude. This strategy matches TV regularization, and it can be linked to norms that are statistically robust to error. In that way, large variations are penalized more mildly than what happens when the popular L2-norm is used; therefore, regions with large gradients, as is the case with edges, are better handled. Rewriting (18) yields

J(u, v) = ∫ ( (f_x u + f_y v + f_t)² + α sqrt( |∇u|² + |∇v|² + β² ) ) dx dy ,

where β serves as a small control parameter to avoid having a zero denominator in (19). Another functional that also provides a TV regularization estimate is proposed in (Drulea & Nedevschi, 2011). The consequent Euler-Lagrange equations, given by

f_x² u + f_x f_y v + f_x f_t − α L_TV(u, v) = 0 ,  and

f_x f_y u + f_y² v + f_y f_t − α L_TV(v, u) = 0 ,

are very similar in structure to (20)-(21). However,

L_TV( z(x,y), z̃(x,y) ) = ∇·( D(∇z(x,y), ∇z̃(x,y)) ∇z(x,y) )

is clearly a nonlinear differential operator in z and z̃, since

D(∇z, ∇z̃) = 1/sqrt( |∇z|² + |∇z̃|² + β² ) I ,

with I the identity matrix, b = 0 and c = a in the notation of (19). Soon it will become clear that the differential operator L_TV is nonlinear and that it seriously impacts the resultant discrete system of equations.

A suitable discretization of the previous Euler-Lagrange equations can be obtained for the unknown functions u(x, y, t) and v(x, y, t) on a grid with pixel size h = (h_x, h_y)^T, where u^h_{i,j} stands for the approximation to u at the pixel located at (i, j), with i = 1, …, N_x and j = 1, …, N_y. The spatial and temporal derivatives of the image data and the discretized versions of the operators L_AN and L_TV are approximated using finite differences as follows:

(f^h_{x,i,j})² u^h_{i,j} + f^h_{x,i,j} f^h_{y,i,j} v^h_{i,j} + f^h_{x,i,j} f^h_{t,i,j} − α (L^h_{AN,i,j} u^h)_{i,j} = 0 ,
f^h_{x,i,j} f^h_{y,i,j} u^h_{i,j} + (f^h_{y,i,j})² v^h_{i,j} + f^h_{y,i,j} f^h_{t,i,j} − α (L^h_{AN,i,j} v^h)_{i,j} = 0 ,

where the operator L^h_{AN,i,j} indicates L_AN discretized at the pixel located at (i, j). The previous expressions amount to a linear system of 2 N_x N_y equations in u^h_{i,j} and v^h_{i,j}. Discretizing the Euler-Lagrange equations of the corresponding TV-based method leads to the nonlinear system of equations shown underneath:

(f^h_{x,i,j})² u^h_{i,j} + f^h_{x,i,j} f^h_{y,i,j} v^h_{i,j} + f^h_{x,i,j} f^h_{t,i,j} − α (L^h_{TV,i,j} u^h)_{i,j} = 0 ,
f^h_{x,i,j} f^h_{y,i,j} u^h_{i,j} + (f^h_{y,i,j})² v^h_{i,j} + f^h_{y,i,j} f^h_{t,i,j} − α (L^h_{TV,i,j} v^h)_{i,j} = 0 .

Here, the finite difference approximation of L_TV(u, v) and L_TV(v, u) yields the product of a common nonlinear operator L^h_{TV,i,j}(u^h_{i,j}, v^h_{i,j}) and the pixels u^h_{i,j} and v^h_{i,j}, respectively.

3.2 Solutions and Recommendations


TV norms can be discretized differently from what was shown in previous sections if finite differences with typical geometric arrangements of close pixels (involving 3, 4 or 8 neighbors) and/or special norms are used. In Section 2, the TV norm regularization was stated so that it could benefit from the fast algorithm proposed by (Chambolle, 2004).

A discretization procedure is an approximate representation of real continuous signals. Since the pixel dimension cannot usually be selected in computer vision, presupposing that it is small enough may not be appropriate. TV norms founded on finite differences are also arguable due to the need for subpixel accuracy in algorithms that require, for example, sub-pixel interpolation. According to sampling theory, the discretization procedure shown in Section 2 is not good because, notwithstanding the fact that f was sampled consistently with Shannon's theorem, the squares in |∇f| bring in high frequencies that require smaller sampling intervals to be attenuated. Therefore, the estimation of |∇f| has problems due to aliasing, and the resulting TV norm estimate of f will carry artifacts.

Despite the fact that using the TV norm can decrease fluctuations and improve regularization of inverse problems in computer vision without compromising edges, it has some undesirable side effects:

i. The solution of the (Rudin, Osher, & Fatemi, 1992) model is prone to contrast loss due to the scaling of the regularization and fidelity terms, because it decreases the bounded TV norm of a function in the vicinity of its mean. In general, a reduction of the contrast decreases the regularization term of the (Rudin, Osher, & Fatemi, 1992) model and boosts the fidelity term.

ii. Geometric alterations may appear, since the TV norm of a function is reduced once the length of all level sets is decreased. Sometimes this distorts silhouettes that are not part of the shape-invariant set when the (Rudin, Osher, & Fatemi, 1992) model is employed. Still, for circular parts, (Strong & Chan, 2003) have demonstrated that shape is kept for a small variation in λ as well as in location, albeit in the presence of moderate noise. Corners may suffer deformation as well.

iii. Staircasing refers to the case where the estimated image may appear blocky outside corners due to the high values of the level set curvature. TV norm amendments that include higher-order derivatives are an alternative to this problem when there is sensible parameter selection.

iv. Even though extremely valuable, the TV norm cannot always keep textures, because the model from (Rudin, Osher, & Fatemi, 1992) has the propensity to affect small features present in images, and it can suffer because of scaling. Hence, the net effect is texture loss.

The NUMIPAD library has a collection of techniques to solve inverse problems, such as Tikhonov regularization, Total Variation, Basis Pursuit, etc. (Rodríguez & Wohlberg). Another very good package, written in MATLAB, is L1-magic (Candes, Romberg, & Tao).

FUTURE RESEARCH DIRECTIONS


For models such as g = Hf + n, it is not viable to state unambiguously the probabilistic relationship associated with the convolving functions when g is known and Bayesian inference is used (Likas & Galatsanos, 2004)(Mateos, Molina, & Katsaggelos, 2005). A variational scheme can help to overcome this hindrance with higher performance than conventional techniques. The principal deficiency of the variational line of reasoning is the lack of systematic evaluation procedures to appraise the tightness of the variational bound. Evidently, more studies on this topic and on optimization procedures are required. Still, the suggested method is rather extensive, so that it can be combined with other Bayesian models for several imaging applications.

It is a well-known fact that the LS estimate is not robust to outliers. There are some recent efforts in compressed sensing (Candes & Wakin, 2008)(Candes, Romberg, & Tao, 2006) that focus on the L0-L1 equivalence to determine the gradient which best replaces the degraded gradient field. These works investigate robust strategies to estimate gradients by taking into consideration error corrections along with concepts from research on sparse signal retrieval. Among other things, they confirm that the location of errors is as important as the number of errors when it comes to the gradient integration required by the TV norm.
To reconcile TV with Shannon theory (Shannon, 1948), the TV of a discrete image f can be defined as the exact (continuous) TV of its Shannon interpolate F, which is obtained from the Fourier transform of f. However, since TV(F) cannot be computed exactly, (Moisan, 2007) uses a Riemann sum with an oversampling factor n and defines the Spectral Total Variation (STV) of f (of order n ≥ 1) in this manner:

STV_n(f) = (1/n²) Σ_{0 ≤ k,l < nN} |∇F(k/n, l/n)| .

STV_n(f) is supposed to yield a fine estimate of TV(F) for any f, given that this measure is a regularization term. n = 1 is not a good option, because controlling the gradient norm of F only at grid points does not permit its control between grid points. When n = 2, a new TV discretization is obtained along with several improvements: grid independence, compatibility with Shannon theory, and the possibility of achieving subpixel precision, at the expense of applying Fourier transforms, which is a widespread necessity in deconvolution problems, for instance.
Adaptive TV norm calculations are needed in order to better care for texture and fine-scale details. The adaptive procedure then enforces local constraints depending on local metrics (Gilboa, Sochen, & Zeevi, 2003).
Despite the fact that most works concentrate on scalar functions, the extension of the reasoning used in this chapter to color or multi-channel images remains an important challenge, because it requires vector-valued parameters and/or functions. This generalization is not trouble-free, but it can result from geometric measure theory.
The improvement of proper multigrid approaches becomes more complicated thanks to the anisotropy and/or nonlinearity of the basic regularization strategies, but they can lead to real-time performance.

CONCLUSION
TV regularization is a broadly applied method because it keeps image edges. Developments of this technique are centered mostly on the application of higher-order derivatives (Chan, Esedoglu, Park, & Yip, 2005)(Stefan, Renaut, & Gelb, 2010)(Chambolle & Lions, 1997)(Yuan, Schnörr, & Steidl, 2009)(Chan, Marquina, & Mulet, 2000) and on nonlocal simplifications, too. The fundamental idea behind TV regularization can benefit from the use of a more general differential operator. This increases flexibility because it accounts for the occurrence of a linear system and assorted inputs.

In view of the fact that there exist further ways of estimating the divergence operator required by the TV norm, the computing time and image quality can be made superior.

To augment the efficiency of the regularization procedure, alternative algorithms relying on dual forms of the TV norm call for further research on topics such as exponential spline wavelets (Khalidov & Unser, 2006) or generalized Daubechies wavelets (Schwamberger, Le, Schölkopf, & Franz, 2010)(Vonesch, Blu, & Unser, 2007). Undeniably, these wavelet improvements can be tuned to a given differential operator, and their use for regularized computer vision purposes corresponds to a synthesis prior (Franz & Schölkopf, 2006). Prospective research is also wanted to establish a more precise relationship between discrete-domain approaches and suitable forms of TV regularization in the continuous domain. Current work (Vonesch, Blu, & Unser, 2007) has shown that the usual TV norm calculation via the L1-norm with finite differences can be associated with appropriate representations of stochastic processes.

Compressed sensing (CS) enables recovery of compressed images from a small number of linear measurements (Kienzle, Bakir, Franz, & Schölkopf, 2005; Candes & Wakin, 2008). It can be an alternative to methods that try to handle missing information but involve larger image representations. It is well known that, without noise contamination, images with completely sparse gradients can be recovered with a high degree of accuracy through the TV-norm. Hence, several CS algorithms rely on TV regularization because, according to (Candes, Romberg, & Tao, 2006), they produce outstanding outputs for images with sparse discrete gradients. TV methods also work well with piecewise-constant images. Furthermore, since images are easier to compress when the discrete gradient representation is used, the TV-norm has advantages over wavelets in the presence of additive and/or quantization noise (Jiang, Li, Haimi-Cohen, Wilford, & Zhang, 2012).
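To illustrate the sparse-gradient claim on the smallest possible example, the sketch below (our own toy code, not an algorithm from the cited CS papers) denoises a piecewise-constant 1-D signal by plain gradient descent on a smoothed ROF objective; the parameter values for `lam`, `eps`, and `step` are illustrative choices.

```python
import numpy as np

def tv_denoise_1d(f, lam=0.3, eps=1e-2, step=0.05, iters=800):
    """Minimize 0.5*||u - f||^2 + lam * sum_i sqrt((u[i+1]-u[i])^2 + eps)
    by gradient descent.

    A simple sketch: the dual and majorization-minimization solvers in the
    references are far more efficient in practice."""
    u = np.asarray(f, dtype=float).copy()
    for _ in range(iters):
        d = np.diff(u)
        w = d / np.sqrt(d**2 + eps)  # derivative of the smoothed |d|
        # gradient of the TV term: w[k-1] - w[k] at interior samples
        tv_grad = np.concatenate(([-w[0]], w[:-1] - w[1:], [w[-1]]))
        u -= step * ((u - f) + lam * tv_grad)
    return u
```

On a noisy two-plateau signal, the output lies closer to the clean signal than the noisy input does, while the location of the jump is preserved.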

Noise can severely hamper image analysis. Images with unknown noise can be handled without priors if a wavelet decomposition technique is combined with non-isotropic TV filtering, so that one gains from both the multiresolution capacity of the wavelet and the edge-preserving properties of the TV-norm (Zhang, 2009).

REFERENCES
Acar, R., & Vogel, C. R. (1994). Analysis of bounded variation penalty methods for ill-posed problems. Inverse Problems, 10(6), pp. 1217-1229.
Afonso, M. V., Bioucas-Dias, J. M., & Figueiredo, M. A. (2010). An augmented Lagrangian approach to linear inverse problems with compound regularization. ICIP 2010 17th IEEE International Conference on Image Processing (pp. 4169-4172). IEEE.
Afonso, M. V., Bioucas-Dias, J. M., & Figueiredo, M. A. (2011). An augmented Lagrangian approach to the constrained optimization formulation of imaging inverse problems. IEEE Transactions on Image Processing, 20(3), pp. 681-695.
Agarwal, V., Gribok, A., & Abidi, M. (2007). Image restoration using L1 norm penalty function. Inverse Problems in Science and Engineering, 15(8), pp. 785-809.
Alpaydin, E. (2004). Introduction to Machine Learning. Cambridge, Mass., USA: MIT Press.
Alter, F., Durand, S., & Froment, J. (2005). Adapted total variation for artifact free decompression of JPEG images. J. Math. Imaging and Vision, 23(2), 199-211.

Blomgren, P., & Chan, T. F. (1998). Color TV: total variation methods for restoration of vector-valued images. IEEE Transactions on Image Processing, 7(3), 304-309. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/18276250
Brox, T., Bruhn, A., & Weickert, J. (2006). Variational motion segmentation with level sets. European Conference on Computer Vision (ECCV), Vol. 1, 471-483.
Bruhn, A., Weickert, J., Kohlberger, T., & Schnörr, C. (2005). Discontinuity-preserving computation of variational optic flow in real-time. (R. Kimmel, N. Sochen, & J. Weickert, Eds.) Scale Space and PDE Methods in Computer Vision, 3459, 279-290. Springer. Retrieved on August 29, 2014 from http://www.springerlink.com/index/jbevcvnfegkdrw0k.pdf


Candes, E. J., Romberg, J., & Tao, T. (n.d.). l1-magic software. Accessed August 1st, 2012, available at http://users.ece.gatech.edu/~justin/l1magic/
Candes, E., Romberg, J., & Tao, T. (2006). Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information. IEEE Trans. Inform. Theory, 52(2), pp. 489-509.
Candes, E. J., & Wakin, M. (March 2008). An introduction to compressive sampling. IEEE Signal Processing Magazine, 25(2), pp. 21-30.
Caselles, V., Chambolle, A., & Novaga, M. (2011). Total variation in imaging. (O. Scherzer, Ed.) Media, 1(1), pp. 1015-1057.
Chambolle, A., & Lions, P. L. (1997). Image recovery via total variation minimization and related problems. Numerische Mathematik, 76(2), 167-188. Springer. doi:10.1007/s002110050258
Chambolle, A. (2004). An algorithm for total variation minimization and applications. Journal of Mathematical Imaging and Vision, 20(1/2), 89-97. Springer. doi:10.1023/B:JMIV.0000011321.19549.88
Chan, T. F., & Shen, J. (2005). Image Processing and Analysis - Variational, PDE, Wavelet, and Stochastic Methods. SIAM.
Chan, T. F., Marquina, A., & Mulet, P. (2000). High-order total variation-based image restoration. SIAM J. Sci. Comput., 22(2), pp. 503-516.
Chan, T. F., & Wong, C. K. (1998). Total variation blind deconvolution. IEEE Transactions on Image Processing, 7(3), 370-375. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/18276257
Chan, T., Esedoglu, S., Park, F., & Yip, A. (2005). Recent developments in total variation image restoration. Handbook of Mathematical Models in Computer Vision (pp. 17-30). New York: Springer.
Cheng-wu, L. (2009). A fast algorithm for variational image inpainting. 2009 AICI International Conference on Artificial Intelligence and Computational Intelligence, 3, pp. 439-443.
Cohen, I. (1993). Nonlinear variational method for optical flow computation. Proc. Eighth Scandinavian Conference on Image Analysis, 1, pp. 523-530. Tromso, Norway.
Coelho, A. M., & Estrela, V. V. (2012a). Data-driven motion estimation with spatial adaptation. International Journal of Image Processing (IJIP), 6(1). Retrieved from http://www.cscjournals.org/csc/manuscript/Journals/IJIP/volume6/Issue1/IJIP-513.pdf
Coelho, A. M., & Estrela, V. V. (2012b). EM-based mixture models applied to video event detection. In P. Sanguansat (Ed.), Intech. doi:10.5772/2693
Coelho, A. M., & Estrela, V. V. (2012c). A study on the effect of regularization matrices in motion estimation, IJCA, 2012 (submitted).


Kang, M. G., & Katsaggelos, A. K. (1995). General choice of the regularization functional in regularized image restoration. IEEE Transactions on Image Processing, 4(5), 594-602. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/18290009
Donoho, D. (2008). Fast solution of l1-norm minimization problems when the solution may be sparse. IEEE Transactions on Information Theory, 54(11), pp. 4789-4812.
Drulea, M., & Nedevschi, S. (2011). Total variation regularization of local-global optical flow. Proc. 14th International IEEE Conference on Intelligent Transportation Systems (ITSC), (pp. 318-323).
Efron, B., Hastie, T., Johnstone, I., & Tibshirani, R. (2004). Least angle regression. Annals of Statistics, 32, pp. 407-499.
Fontana, R. (2004). Recent system applications of short-pulse ultra-wideband (UWB) technology. IEEE Transactions on Microwave Theory and Technique, 52(9), pp. 2087-2104.
Franz, M., & Schölkopf, B. (2006). A unifying view of Wiener and Volterra theory and polynomial kernel regression. Neural Comp., 18, pp. 3097-3118.
Friedman, J., Hastie, T., & Tibshirani, R. (2010). Regularization paths for generalized linear models via coordinate descent. Journal of Statistical Software, 33(1), 1.
Fromont, M. (2007). Model selection by bootstrap penalization for classification. Machine Learning, 66(2-3), pp. 165-207.
Galatsanos, N., & Katsaggelos, A. (1992). Methods for choosing the regularization parameter and estimating the noise variance in image restoration and their relation. IEEE Trans. Image Processing, pp. 322-336.
Gilboa, G., Sochen, N., & Zeevi, Y. Y. (2003). Texture preserving variational denoising using an adaptive fidelity term. Proc. VLSM 2003. Nice, France.
Guichard, F., & Malgouyres, F. (1998). Total variation based interpolation. Proc. European Signal Processing Conf., 3, pp. 1741-1744.
Horn, B., & Schunck, B. (1981). Determining optical flow. Artificial Intelligence, 17, pp. 185-203.
Jiang, H., Li, C., Haimi-Cohen, R., Wilford, P., & Zhang, Y. (2012). Scalable video coding using compressive sensing. Bell Labs Technical Journal, 16(4), pp. 149-169.
Khalidov, I., & Unser, M. (April 2006). From differential equations to the construction of new wavelet-like bases. IEEE Trans. Signal Process., 54(4), pp. 1256-1267.
Kienzle, W., Bakir, G., Franz, M., & Schölkopf, B. (2005). Face detection - efficient and rank deficient. (Y. Weiss, Ed.) Advances in Neural Information Processing Systems, 17, pp. 673-680.
Likas, A., & Galatsanos, N. (August 2004). A variational approach for Bayesian blind image deconvolution. IEEE Transactions on Signal Processing, 52(8), pp. 2222-2233.


Mateos, J., Molina, R., & Katsaggelos, A. (2005). Approximations of posterior distributions in blind deconvolution using variational methods. Proceedings of the IEEE International Conference on Image Processing (ICIP 2005), 2, pp. II-770-773.
Moisan, L. (2007). How to discretize the total variation of an image? PAMM Proc. Appl. Math. Mech., 7, 1041907-1041908. doi:10.1002/pamm.200700424
Molina, R., Vega, M., & Katsaggelos, A. K. (2007). From global to local Bayesian parameter estimation in image restoration using variational distribution approximations. 2007 IEEE International Conference on Image Processing (Vol. 1). doi:10.1109/ICIP.2007.4378906
Money, J. H. (2006). Variational methods for image deblurring and discretized Picard's method. University of Kentucky Doctoral Dissertations. Paper 381. (http://uknowledge.uky.edu/gradschool_diss/381). Kentucky, USA.
Nagel, H. H., & Enkelmann, W. (1986). An investigation of smoothness constraints for the estimation of displacement vector fields from image sequences. IEEE Transactions on Pattern Analysis and Machine Intelligence, 8, pp. 565-593.
Reddy, D., Agrawal, A., & Chellappa, R. (2009). Enforcing integrability by error correction using L1-minimization. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition CVPR09, (pp. 2350-2357).
Rodríguez, P., & Wohlberg, B. (n.d.). Numerical methods for inverse problems and adaptive decomposition (NUMIPAD). Accessed 2012, available at http://sourceforge.net/projects/numipad/
Roudenko, S. (2004). Noise and texture detection in image processing. LANL report: W-7405-ENG-36.
Rudin, L. I., & Osher, S. (1994). Total variation based image restoration with free local constraints. Proc. IEEE Int. Conf. Image Processing, 1, pp. 31-35.
Rudin, L., Osher, S., & Fatemi, E. (1992). Nonlinear total variation based noise removal algorithms. Physica D, 60, pp. 259-268.
Schwamberger, V., Le, P. H., Schölkopf, B., & Franz, M. O. (2010). The influence of the image basis on modeling and steganalysis performance. In R. Böhme & P. Fong (Eds.), Proc. of the 12th Intl. Conf. on Information Hiding, (pp. 133-144). Calgary.
Shannon, C. (1948). A mathematical theory of communication. Bell System Technical Journal, 27, pp. 379-423 and 623-656.
Sfikas, G., Nikou, C., Galatsanos, N., & Heinrich, C. (2011). Majorization-minimization mixture model determination in image segmentation. CVPR 2011 (pp. 2169-2176). IEEE.
Stefan, W., Renaut, R., & Gelb, A. (2010). Improved total variation-type regularization using higher order edge detectors. SIAM J. Imag. Sci., 232-251.


Strong, D., & Chan, T. (2003). Edge-preserving and scale-dependent properties of total variation regularization. Inverse Problems, 19, pp. S165-S187.
Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. J. R. Statist. Soc. B, 58, pp. 267-288.
Vapnik, V. (1995). The nature of statistical learning theory. New York: Springer Verlag.
Vogel, C. R., & Oman, M. E. (1996). Iterative methods for total variation denoising. SIAM J. on Scientific Computing, 17(1-4), pp. 227-238.
Vonesch, C., Blu, T., & Unser, M. (2007). Generalized Daubechies wavelet families. IEEE Trans. Signal Processing, 55(9), pp. 4415-4429.
Wang, L., Gordon, M. D., & Zhu, J. (2006). Regularized least absolute deviations regression and an efficient algorithm for parameter tuning. Sixth International Conference on Data Mining, (pp. 690-700).
Wang, Y., Yang, J., Yin, W., & Zhang, Y. (2008). A new alternating minimization algorithm for total variation image reconstruction. SIAM Journal on Imaging Sciences, 1, pp. 248-272.
Weickert, J., & Schnörr, C. (December 2001). A theoretical framework for convex regularizers in PDE-based computation of image motion. International Journal of Computer Vision, 45(3), pp. 245-264.
Yuan, J., Schnörr, C., & Steidl, G. (2009). Total-variation based piecewise affine regularization. Proc. of the 2nd Int. Conf. Scale Space Variation. Methods in Comput. Vis. (SSVM), (pp. 552-564).
Zeng, T., & Ng, M. K. (2010). On the total variation dictionary model. IEEE Transactions on Image Processing. doi:10.1109/TIP.2009.2034701
Zhang, Y. (2009). User's Guide for YALL1: Your ALgorithms for L1 Optimization. Technical Report TR09-17, Department of Computational and Applied Mathematics, Rice University, Houston, Texas, USA.
Zuo, W. (2011). A generalized accelerated proximal gradient approach for total-variation-based image restoration. IEEE Transactions on Image Processing, 20, pp. 2748-2759.

ADDITIONAL READING SECTION


Charbonnier, P., Blanc-Feraud, L., Aubert, G., & Barlaud, M. (1997). Deterministic edge-preserving regularization in computed imaging. IEEE Transactions on Image Processing, 6(2), 298-311. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/18282924


Christiansen, O., Lee, T.-M., Lie, J., Sinha, U., & Chan, T. F. (2007). Total variation regularization of matrix-valued images. International Journal of Biomedical Imaging, 2007, 27432. Hindawi Publishing Corporation. Retrieved from http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=1994779&tool=pmcentrez&rendertype=abstract
Coelho, A. M., Estrela, V. V., do Carmo, F. P., & Fernandes, S. R. (2012). Error concealment by means of motion refinement and regularized Bregman divergence. Proceedings of the Intelligent Data Engineering and Automated Learning - IDEAL 2012, (Eds.): Yin, H., Costa, J. A. F., Barreto, G. A., 13th International Conference, Natal, Brazil, 650-657.
Combettes, P. L., & Pesquet, J. C. (2003a). Total variation information in image recovery. Proceedings 2003 International Conference on Image Processing (Cat. No.03CH37429) (Vol. 3). doi:10.1109/ICIP.2003.1247259
Combettes, P. L., & Pesquet, J. C. (2003b). Image deconvolution with total variation bounds. Seventh International Symposium on Signal Processing and Its Applications 2003 Proceedings (Vol. 1, pp. 441-444). IEEE. doi:10.1109/ISSPA.2003.1224735
Combettes, P. L., & Pesquet, J.-C. (2004). Image restoration subject to a total variation constraint. IEEE Transactions on Image Processing, 13(9), 1213-1222. IEEE. doi:10.1109/TIP.2004.832922
Drapaca, C. S. (2009). A nonlinear total variation-based denoising method with two regularization parameters. IEEE Transactions on Biomedical Engineering, 56(3), 582-586. doi:10.1109/TBME.2008.2011561
Estrela, V. V., Rivera, L. A., Beggio, P. C., & Lopes, R. T. (2003). Regularized pel-recursive motion estimation using generalized cross-validation and spatial adaptation. SIBGRAPI (pp. 331-338). Retrieved from http://dblp.uni-trier.de/db/conf/sibgrapi/sibgrapi2003.html#EstrelaRBL03
Farcomeni, A. (2010). Bayesian constrained variable selection. Statistica Sinica, 20(1), 1043-1062. doi:10.1007/s00439-008-0582-9
Figueiredo, M. A. T., Bioucas-Dias, J. M., & Nowak, R. D. (2007). Majorization-minimization algorithms for wavelet-based image restoration. IEEE Transactions on Image Processing, 16(12), 2980-2991. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/18092597
Figueiredo, M., Dias, J., Oliveira, J., & Nowak, R. (2006). On total variation denoising: a new majorization-minimization algorithm and an experimental comparison with wavelet denoising. IEEE International Conference on Image Processing 2006, (5), 2633-2636. doi:10.1109/ICIP.2006.313050
Fornasier, M., Langer, A., & Schönlieb, C.-B. (2009). A convergent overlapping domain decomposition method for total variation minimization. Numerische Mathematik, 116(4), 645-685. Springer. Retrieved from http://arxiv.org/abs/0905.2404
Fornasier, M., & Schönlieb, C.-B. (2007). Subspace correction methods for total variation and l1-minimization. Physics, 47(5), 33. SIAM. Retrieved from http://arxiv.org/abs/0712.2258


Galatsanos, N. (2008). A majorization-minimization approach to total variation reconstruction of super-resolved images. Stat, (30), 2-6.
Hu, Y., & Jacob, M. (2012). Higher degree total variation (HDTV) regularization for image recovery. IEEE Transactions on Image Processing, 21(5), 2559-2571. doi:10.1109/TIP.2012.2183143
Hutter, M. (2009). Discrete MDL predicts in total variation. (Y. Bengio, D. Schuurmans, J. Lafferty, C. K. I. Williams, & A. Culotta, Eds.) Advances in Neural Information Processing Systems, 1-9. Curran Associates. Retrieved from http://eprints.pascal-network.org/archive/00005838/
Jaggi, M. (2010). A simple algorithm for nuclear norm regularized problems. In Practice, (X), 471-478. Citeseer. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.167.4776&rep=rep1&type=pdf
Jin, B., & Lorenz, D. (2010). Heuristic parameter-choice rules for convex variational regularization based on error estimates. Program, 48(3), 1-24. SIAM. Retrieved from http://arxiv.org/abs/1001.5346
Kang, M. G., & Katsaggelos, A. K. (1995). General choice of the regularization functional in regularized image restoration. IEEE Transactions on Image Processing, 4(5), 594-602. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/18290009
Knoll, F., Bredies, K., Pock, T., & Stollberger, R. (2011). Second order total generalized variation (TGV) for MRI. Magnetic Resonance in Medicine, 65(2), 480-491. Wiley Online Library. doi:10.1002/mrm.22595
Koko, J., & Jehan-Besson, S. (2010). An augmented Lagrangian method for TVG+L1-norm minimization. Journal of Mathematical Imaging and Vision, 38(3), 182-196. Springer Netherlands. doi:10.1007/s10851-010-0219-1
Lasenby, A. N., Barreiro, R. B., & Hobson, M. P. (2001). Regularization and inverse problems. (J. Skilling, Ed.) Inverse Problems, 375, 15. Kluwer. Retrieved from http://arxiv.org/abs/astro-ph/0104306
Lee, S.-H., & Kang, M. G. (2007). Total variation-based image noise reduction with generalized fidelity function. IEEE Signal Processing Letters (Vol. 14, pp. 832-835). IEEE. doi:10.1109/LSP.2007.901697
Li, Y., & Santosa, F. (1996). A computational algorithm for minimizing total variation in image restoration. IEEE Transactions on Image Processing, 5(6), 987-995. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/18285186
Lv, J., & Fan, Y. (2009). A unified approach to model selection and sparse recovery using regularized least squares. Annals of Statistics, 37(6A), 3498-3528. Retrieved from http://arxiv.org/abs/0905.3573
Marquina, A., & Osher, S. J. (2008). Image super-resolution by TV-regularization and Bregman iteration. Journal of Scientific Computing, 37(3), 367-382. doi:10.1007/s10915-008-9214-8


Michailovich, O. V. (2011). An iterative shrinkage approach to total-variation image restoration. IEEE Transactions on Image Processing, 20(5), 1281-1299. IEEE. Retrieved from http://arxiv.org/abs/0910.5002
Mohammad-Djafari, A. (2001). Bayesian inference for inverse problems. (R. L. Fry, Ed.) AIP Conference Proceedings, 617, 477-496. AIP. Retrieved from http://arxiv.org/abs/physics/0110093
Molina, R., Vega, M., & Katsaggelos, A. K. (2007). From global to local Bayesian parameter estimation in image restoration using variational distribution approximations. 2007 IEEE International Conference on Image Processing (Vol. 1). doi:10.1109/ICIP.2007.4378906
Morgan, S. P., & Vixie, K. R. (2006). L1-TV computes the flat norm for boundaries. Abstract and Applied Analysis, 2007, 1-19. Retrieved from http://arxiv.org/abs/math/0612287
Ng, M. K., Shen, H., Lam, E. Y., & Zhang, L. (2007). A total variation regularization based super-resolution reconstruction algorithm for digital video. EURASIP Journal on Advances in Signal Processing, 2007, 1-17. doi:10.1155/2007/74585
Oliveira, J. P., Bioucas-Dias, J. M., & Figueiredo, M. A. T. (2009). Adaptive total variation image deblurring: A majorization-minimization approach. Signal Processing, 89(9), 1683-1693. doi:10.1016/j.sigpro.2009.03.018
Osher, S., Sole, A., & Vese, L. (2003). Image decomposition, image restoration, and texture modeling using total variation minimization and the H-1 norm. Proceedings 2003 International Conference on Image Processing (Cat. No.03CH37429) (Vol. 1, p. 3). IEEE. doi:10.1109/ICIP.2003.1247055
Osher, S., Burger, M., Goldfarb, D., Xu, J., & Yin, W. (2005). An iterative regularization method for total variation-based image restoration. (W. G. Lehnert & M. H. Ringle, Eds.) Multiscale Modeling & Simulation, 4(2), 460. SIAM. doi:10.1137/040605412
Osher, S., Solé, A., & Vese, L. (2003). Image decomposition and restoration using total variation minimization and the H-1 norm. Multiscale Modeling & Simulation, 1(3), 349. Citeseer. doi:10.1137/S1540345902416247
Press, W. H., Teukolsky, S. A., Vetterling, W. T., & Flannery, B. P. (2007). Numerical Recipes Source Code CD-ROM 3rd Edition: The Art of Scientific Computing. Retrieved from http://dl.acm.org/citation.cfm?id=1388393
Rodríguez, P., & Wohlberg, B. (2009). Efficient minimization method for a generalized total variation functional. IEEE Transactions on Image Processing, 18(2), 322-332. doi:10.1109/TIP.2008.2008420
Rosasco, L., Santoro, M., Mosci, S., Verri, A., & Villa, S. (2010). A regularization approach to nonlinear variable selection. Proceedings of the International Conference on Artificial Intelligence and Statistics, 9, 653-660. Retrieved from http://jmlr.csail.mit.edu/proceedings/papers/v9/rosasco10a/rosasco10a.pdf


Sakurai, M., Kiriyama, S., Goto, T., & Hirano, S. (2011). Fast algorithm for total variation minimization. 2011 18th IEEE International Conference on Image Processing (pp. 1461-1464). IEEE. Retrieved from http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=6115718
Scales, J. A., & Gersztenkorn, A. (1988). Robust methods in inverse theory. (J. A. Scales, Ed.) Inverse Problems, 4(4), 1071-1091. Soc. of Expl. Geophys. doi:10.1088/0266-5611/4/4/010
Sidky, E. Y., Chartrand, R., Duchin, Y., Ullberg, C., & Pan, X. (2010). High resolution image reconstruction with constrained, total-variation minimization. IEEE Nuclear Science Symposium Medical Imaging Conference, (2), 2617-2620. IEEE. Retrieved from http://arxiv.org/abs/1104.0909
Sidky, E. Y., Duchin, Y., Ullberg, C., & Pan, X. (2010). A constrained, total-variation minimization algorithm for low-intensity X-ray CT. Medical Physics, 38(S1), S117. Retrieved from http://arxiv.org/abs/1011.4630
Slesareva, N., Bruhn, A., & Weickert, J. (2005). Optic flow goes stereo: a variational method for estimating discontinuity-preserving dense disparity maps. German Conference on Pattern Recognition (DAGM), Vol. 1 (pp. 33-40).
Strong, D. M., & Chan, T. F. (1996). Exact Solutions to Total Variation Regularization Problems. UCLA CAM Report. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.51.798
Stuart, A. M. (2010). Inverse problems: A Bayesian perspective. Acta Numerica, 19(1), 451-559. Cambridge University Press. doi:10.1017/S0962492910000061
Tao, M., & Yang, J. (2009). Alternating direction algorithms for total variation deconvolution in image reconstruction. TR0918, Department of Mathematics, 1-17. Retrieved from http://www.optimization-online.org/DB_FILE/2009/11/2463.pdf
Tarantola, A. (2005). Inverse problem theory. (SIAM, Ed.) Physica B: Condensed Matter (Vol. 130, pp. 77-78). SIAM. doi:10.1137/1.9780898717921
Tsai, C.-L., & Chien, S.-Y. (2011). New optimization scheme for L2-norm total variation semi-supervised image soft labeling. 2011 18th IEEE International Conference on Image Processing (pp. 3369-3372). IEEE.
Wang, Y., & Zhou, H. (2006). Total variation wavelet-based medical image denoising. International Journal of Biomedical Imaging, 2006, 1-6. Hindawi Publishing Corporation. Retrieved from http://www.hindawi.com/journals/ijbi/2006/089095/abs/
Wedel, A., Pock, T., Zach, C., Bischof, H., & Cremers, D. (2009). An improved algorithm for TV-L1 optical flow. (D. Cremers, B. Rosenhahn, A. L. Yuille, & F. R. Schmidt, Eds.) Statistical and Geometrical Approaches to Visual Motion Analysis, 23-45. Springer. doi:10.1007/978-3-642-03061-1_2
Weiss, P., Blanc-Feraud, L., & Aubert, G. (2009). Efficient schemes for total variation minimization under constraints in image processing. SIAM Journal on Scientific Computing, 31(3), 2047. doi:10.1137/070696143


Wen, Y., & Chan, R. (2011). Parameter selection for total variation based image restoration using discrepancy principle. IEEE Trans. Image Process., (2), 1-12. doi:10.1109/TIP.2011.2181401
Wen, Y.-W., Ng, M. K., & Huang, Y.-M. (2008). Efficient total variation minimization methods for color image restoration. IEEE Transactions on Image Processing, 17(11), 2081-2088. doi:10.1109/TIP.2008.2003406
Wohlberg, B., & Rodriguez, P. (2007). An iteratively reweighted norm algorithm for minimization of total variation functionals. IEEE Signal Processing Letters (Vol. 14, pp. 948-951). doi:10.1109/LSP.2007.906221
Yin, W., Goldfarb, D., & Osher, S. (2005). Image cartoon-texture decomposition and feature selection using the total variation regularized L1 functional. Variational, Geometric and Level Set Methods in Computer Vision, 3752, 73-84. Springer. Retrieved August 29, 2014, from http://www.springerlink.com/index/D3JNF25VRHPXLUN1.pdf
Zach, C., Pock, T., & Bischof, H. (2007). A duality based approach for realtime TV-L1 optical flow. (F. A. Hamprecht, C. Schnörr, & B. Jähne, Eds.) Computer, 1(1), 214-223. Springer-Verlag. doi:10.1007/978-3-540-74936-3_22
Zeng, T., & Ng, M. K. (2010). On the total variation dictionary model. IEEE Transactions on Image Processing. doi:10.1109/TIP.2009.2034701
