Image denoising algorithms implemented in tensorflow 2
The deep image prior algorithm is implemented in tensorflow 2.
Command line usage:
```
usage: python deepimprior.py ./data/im03.png --outpath ./data/outimages/
       --outprefix out_denoised --verbose 1 --epochs 10000 --learning_rate 0.1
       --save_model_path ./data/outmodels/model --period 20

Denoise a given image using the deep image prior algorithm.

Beware of the following issues before proceeding with the usage of this script:
- Convolution based algorithms are sensitive to image size. Please use a
  square image, e.g. 800x800 or 600x600.
- Image size significantly affects the training. Either make sure you have
  enough computation power, or adjust the image size appropriately.
- Lastly, as with all gradient based methods, we are using a stable learning
  rate. Feel free to adjust it before the training phase. I am thinking of
  adding a decaying learning rate option in the future.

positional arguments:
  imagepath             path to the image

optional arguments:
  -h, --help            show this help message and exit
  --outpath OUTPATH     path for saving outputs
  --outprefix OUTPREFIX
                        prefix that will be prepended to intermediate files
  --epochs EPOCHS       number of training epochs
  --verbose {0,1}       verbose output during training
  --learning_rate LEARNING_RATE
                        learning rate for the optimizer
  --save_model_path SAVE_MODEL_PATH
                        save model to path at each period of epochs
  --load_model_path LOAD_MODEL_PATH
                        load model from path to resume training
  --period PERIOD       periodic activity number: saving images, models, etc.
                        at the end of each period of epochs
```
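Since the script expects a square image, here is one way to prepare such an input with Pillow before training. This is only a sketch: the file names and the 600x600 target size are example values, not something the script requires.

```python
from PIL import Image

# center-crop the noisy image to a square and resize it
# (example values only; pick a size your hardware can handle)
img = Image.open("./data/my_noisy_image.png")
side = min(img.size)
left = (img.width - side) // 2
top = (img.height - side) // 2
square = img.crop((left, top, left + side, top + side)).resize((600, 600))
square.save("./data/my_noisy_image_square.png")
```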
Several use cases are implemented:
- If you want to reuse the object that encapsulates the options covered in the command line usage in another setting, just import the `DeepImPriorManager` object. Here is how to do it:
```python
from typing import Optional

from PIL import Image

from deepimprior import DeepImPriorManager

verbose = True
image_path = "./data/my_noisy_image.png"
period = 200  # interval of epochs, used for scheduling callbacks
learning_rate = 0.01  # the value is taken from the paper
epochs = 2400  # the value is taken from the paper
out_folder = "./data/outimages"
out_prefix = "my_denoised_"
save_model_path = "./data/outmodels/model"  # save model here at the end of a period

# if you want to resume training
load_model_path: Optional[str] = "./data/outmodels/model_1000"

# in verbose output you can save the model plot and its summary to a file
plot_path = "plot_model.png"
summary_path = "model_summary.txt"

deep_prior = DeepImPriorManager(
    noisy_image=Image.open(image_path),
    verbose=verbose,
    period=period,
    learning_rate=learning_rate,
    epochs=epochs,
    out_folder=out_folder,
    out_prefix=out_prefix,
    save_model_path=save_model_path,
    load_model_path=load_model_path,
    plot_path=plot_path,
    summary_path=summary_path,
)

# fits the model and saves it to "out_folder/out_prefix-result.png"
deep_prior.run_save()

# or you can just fit the model and do something else with it, like
# evaluation etc.
deep_prior.run()
model = deep_prior.model
# do other stuff with the model
```
If you want to rebuild the architecture used in the paper, the `u_i`, `d_i` and `s_i` functions implement the components of the architecture implied in figure 21 on page 19. They output a list of layers. No input shape is given during their creation. We diverge from the paper in padding: the paper uses reflect padding. Since the keras api does not implement reflect padding, we use the "same" padding option instead.
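As a rough illustration of what such a layer-list-producing function can look like, here is a hypothetical sketch of a downsampling block in the spirit of `d_i`. The filter counts, kernel sizes and layer ordering are assumptions for illustration only, not the repository's actual implementation; note the "same" padding mentioned above.

```python
from typing import List

import tensorflow as tf


def d_block_sketch(filters: int, kernel_size: int = 3) -> List[tf.keras.layers.Layer]:
    """Hypothetical downsampling block in the style of the paper's d_i.

    It returns a list of layers with no input shape attached, so the
    shape is only fixed later, when the list is evaluated on an input.
    """
    return [
        tf.keras.layers.Conv2D(filters, kernel_size, strides=2, padding="same"),
        tf.keras.layers.BatchNormalization(),
        tf.keras.layers.LeakyReLU(),
        tf.keras.layers.Conv2D(filters, kernel_size, padding="same"),
        tf.keras.layers.BatchNormalization(),
        tf.keras.layers.LeakyReLU(),
    ]
```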
For evaluating the list of layers that results from the above mentioned functions, we have written the `apply_layers` function. The function is very simple:
```python
from typing import List

import tensorflow as tf


def apply_layers(inlayer, lst: List[tf.keras.layers.Layer]):
    """!
    \brief apply layers consecutively

    \param inlayer input, either a result of a previous evaluation or
        tf.keras.layers.Input. Notice that it is not
        tf.keras.layers.InputLayer
    """
    x = inlayer
    for layer in lst:
        x = layer(x)
    return x
```
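As a usage sketch, `apply_layers` can be combined with `tf.keras.layers.Input` to build a model from a hand-made list of layers. The input shape and layer choices below are arbitrary examples, not values used by the repository:

```python
import tensorflow as tf

inputs = tf.keras.layers.Input(shape=(256, 256, 3))  # example input shape
layers = [
    tf.keras.layers.Conv2D(16, 3, padding="same"),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.LeakyReLU(),
]
outputs = apply_layers(inputs, layers)
model = tf.keras.Model(inputs=inputs, outputs=outputs)
model.summary()
```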