There are 4,911,458 English language articles in Wikipedia. Today’s featured article for July 8, 2015, is Radiocarbon dating. Therein we find the Shroud of Turin mentioned in the fourth paragraph of the introduction and again, many dozens of paragraphs later, in the next to last paragraph under Use in Archaeology.
The Shroud of Turin was the featured article in Wikipedia on Christmas Day, December 25, 2004.
Out of 4,911,458 articles, what are the chances of that?
“… the issue of organic reactions and non-contemporaneous contamination of ancient materials can be a very serious and complex matter, deserving quantitative investigation of the possible impacts on measurement accuracy…”
http://nvlpubs.nist.gov/nistpubs/jres/109/2/j92cur.pdf
I will not be impressed by big numbers…
The enormous interest in radiocarbon data must not make us forget that this is a kind of destructive examination. Also, speaking from a strictly religious point of view, burning a relic is certainly blasphemy (… unless it is a kind of medieval ordeal …). Nondestructive control systems are feasible with the use of appropriate techniques (leaving aside the “small destructions” … made by Fanti and Malfi).
So, the question “to which we must pay attention” is the following:
Why do we not want to use those fairly reproducible means of control?
Link:
https://en.wikipedia.org/wiki/Trial_by_ordeal
>…In medieval Europe, like trial by combat, trial by ordeal was considered a judicium Dei: a procedure based on the premise that God would help the innocent by performing a miracle on their behalf. …
So. Moral of the story …
Try to see how many English language articles there are
about:
AFM, CFM, AFM-Raman, ATR-FTIR, ATR Raman, or SNOM Raman, etc. …!
Here is another reference on the topic:
“ATR-FTIR spectroscopy and Raman spectroscopy”
“Dead Sea Scroll parchments were produced by different techniques”
Published: Jul 25, 2013
Author: Steve Down
>Spectroscopic examination of fragments of the Dead Sea Scrolls has shown that the parchments were originally produced by at least two different techniques, contrary to popular belief, say a duo of scientists in Germany.
>Ira Rabin and Oliver Hahn from the BAM Federal Institute for Materials Research and Testing have studied scroll fragments which were discovered at four sites up to 50 km from the Qumran Cave, where more than 90% of the known fragments were found, as they discussed in Analytical Methods.
>Using a combination of X-ray fluorescence, ATR-FTIR spectroscopy and Raman spectroscopy, they were able to distinguish between contaminants originating from parchment production, which were distributed throughout the material, and contaminants from the caves where they were stored. … etc. …
Links:
http://www.spectroscopynow.com/details/blog/140166aea2d/Dead-Sea-Scroll-parchments-were-produced-by-different-techniques.html?&tzcheck=1
http://pubs.rsc.org/en/Content/ArticleLanding/2013/AY/c3ay41076e#!divAbstract
http://pubs.rsc.org/en/journals/articlecollectionlanding?sercode=ay&themeid=4ce476b3-6472-4010-b263-c6632da4ba5b
— *** —
Here is another study (= “Dalí’s paintings and cellulosic DP”):
“Looking beneath Dalí’s paint: non-destructive canvas analysis”
by
Marta Oriola, Alenka Možir, Paul Garside, Gema Campo, Anna Nualart-Torroja, Irene Civil, Marianne Odlyha, May Cassar and Matija Strlič
Anal. Methods, 2014,6, 86-96
From themed collection Molecular Analysis for Art, Archaeometry and Conservation
>A new analytical method was developed to non-destructively determine pH and degree of polymerisation (DP) of cellulose in fibres in 19th–20th century painting canvases, and to identify the fibre type: cotton, linen, hemp, ramie or jute.
>The method is based on NIR spectroscopy and multivariate data analysis, while for calibration and validation a reference collection of 199 historical canvas samples was used.
>The reference collection was analysed destructively using microscopy and chemical analytical methods.
>Partial least squares regression was used to build quantitative methods to determine pH and DP, and linear discriminant analysis was used to determine the fibre type.
>To interpret the obtained chemical information, an expert assessment panel developed a categorisation system to discriminate between canvases that may not be fit to withstand excessive mechanical stress, e.g. transportation.
>The limiting DP for this category was found to be 600.
>With the new method and categorisation system, canvases of 12 Dalí paintings from the Fundació Gala-Salvador Dalí (Figueres, Spain) were non-destructively analysed for pH, DP and fibre type, and their fitness determined, which informs conservation recommendations.
>The study demonstrates that collection-wide canvas condition surveys can be performed efficiently and non-destructively, which could significantly improve collection management.
I am curious about this work because Diana and Marinelli indicated the degree of polymerisation (DP) of cellulose as an important parameter, and I then suggested using SPM techniques in order to learn more about linen fibrils already taken from the Shroud…
Unfortunately we can read:
>… The reference collection was analysed destructively using microscopy and chemical analytical methods. …
… destructively???
B.T.W.:
… Do you know Partial least squares (= PLS) regression?
Piero: I recall writing a multivariate regression programme way back in the mid-1970s on one of the first HP Programmable Calculators using an early form of Basic. I have not heard it referred to as “Partial Least Squares” before, and I have not heard of this as a form of analysis distinct from “Least Squares”. I might guess that it could get its name from taking the Partial Derivatives of the Least Square equations on each of the multiple variables to obtain the coefficients and equating the various partial derivatives to zero to minimise the various least square sums.
If you are familiar with ordinary multivariate regression analysis, I’d be inclined to use that, if not, then consult any standard text on Statistics and Probability theory, most of which will include a chapter on MVR.
Very tersely: Let there be n sets of data points in k independent variables x(j) where the x(j) are measured about their sample mean values. Then the objective is to obtain an estimate y’ of the dependent variable y about its mean value. Geometrically the problem is one of finding the equation of the plane which fits best, in the sense of least squares, a set of points in k + 1 dimensions. The equation will be of the form: y’ = Sum [ a(j) . x(j)] (j = 1 to k) where a(j) is the coefficient to be found for each x(j).
Taking the partial differentials of the sum of the least squares expression yields k separate equations which must be solved to obtain each of the a(j).
Equation j will then be of the form: Sum [ a(i) . Sum [ x(j) . x(i) ] ](i = 1 to k) = Sum [x(j) . y]
Nowadays, one would use one of the multivariate spreadsheet packages which include matrix analysis such as that in MS Excel, and it’s all a lot easier. But check out the technique in a textbook first if you’re not familiar with it.
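As a rough illustration only (my own sketch, with invented data, not taken from any of the papers above), here is that normal-equations calculation in Python with numpy:

import numpy as np

def mvr_coefficients(X, y):
    # Solve the normal equations: for each j,
    # Sum_i a(i) . Sum[x(j) . x(i)] = Sum[x(j) . y],
    # i.e. (Xc' Xc) a = Xc' y, with variables measured about their means.
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    return np.linalg.solve(Xc.T @ Xc, Xc.T @ yc)

# Hypothetical data: n = 6 observations in k = 2 independent variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 2))
y = 1.5 * X[:, 0] - 0.7 * X[:, 1] + rng.normal(scale=0.01, size=6)
print(mvr_coefficients(X, y))  # approximately [1.5, -0.7]

In MS Excel the same calculation is available through the LINEST worksheet function.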
Regarding the earlier destructive tests of the Dali paintings, I would presume that only very small samples of the canvasses were destroyed in the test, and I would hope that the main works were not too badly damaged.
First of all, with regard to destructive tests, when you wrote:
“… Regarding the earlier destructive tests of the Dali paintings, I would presume that only very small samples of the canvasses were destroyed in the test, and I would hope that the main works were not too badly damaged…”,
I believe that with the generic term “destructive” they are in reality referring to viscosity measurements: measures that first require the dissolution of the cellulosic material to be examined and then the measurement of the viscosity, which is then correlated (or referable) to the degree of polymerization of the cellulose. Unfortunately, when certain impurities are present (for example: TiO2 = titanium dioxide, in viscose rayon, etc.), the viscosimetric test fails to provide an exact answer (… or you have to calculate the inherent differences due to the contamination)!
See, for example, the norm:
UNI 8282:1994
= Cellulose In Dilute Solutions – Determination Of Limiting Viscosity Number – Method In Cupri-ethylene-diamine (ced) Solution
or
Standard Number: UNI 8282-1994
Title: Cellulose in dilute solutions. Determination of limiting viscosity number. Method in cupri-ethylene-diamine (CED) solution. (Italian title: Cellulosa in soluzioni diluite. Determinazione dell’indice della viscosità limite. Metodo che usa una soluzione di cuprietilendiammina.)
Language: Italian
Publication Date: 1994/10/31
Publisher: Ente Nazionale Italiano di Unificazione (UNI)
Link:
http://www.freestd.us/soft4/1756344.htm
— — —
Thank you for the answer on the matter of Statistics…
In any case I hope to explore the “PLS” topic further…
Link:
https://en.wikipedia.org/wiki/Partial_least_squares_regression
>… The PLS algorithm is employed in partial least squares path modeling. …
— —
B.T.W.:
Do you know the methods of measurement for evanescent wave penetration (in ATR techniques)?
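For reference, the depth probed in ATR is usually characterised by the standard penetration-depth formula d_p = λ / (2π·n1·sqrt(sin²θ − (n2/n1)²)), where n1 is the crystal index, n2 the sample index and θ the angle of incidence. A small Python sketch follows (the diamond/cellulose numbers are only illustrative assumptions):

import math

def penetration_depth(wavelength_um, n_crystal, n_sample, angle_deg):
    # Depth at which the evanescent field decays to 1/e of its surface value.
    theta = math.radians(angle_deg)
    ratio = n_sample / n_crystal
    return wavelength_um / (2 * math.pi * n_crystal
                            * math.sqrt(math.sin(theta) ** 2 - ratio ** 2))

# Illustrative values: 10 um IR light, diamond crystal (n ~ 2.4),
# cellulose-like sample (n ~ 1.5), 45 degree angle of incidence.
print(penetration_depth(10.0, 2.4, 1.5, 45.0))  # ~2 um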
Thank you for the link to the PLS page. It is a technique I am not familiar with. I have some difficulty in seeing how it can be at all robust. It claims to be most effective when the number of variates exceeds the number of data points. For instance: I cannot yet see how the technique would yield more information if there were only two data points in the three variates y versus u, v, beyond giving a straight line between the two data points. But I do not intend to pursue it any further.
I have found a lot of material on PLS…
You wrote:
“…I cannot yet see how the technique would yield more information if there were only two data points in the three variates y versus u, v, beyond giving a straight line between the two data points. …”
I think we can obtain a lot of data points using proper techniques.
Am I wrong?
B.T.W.:
Today I have seen that the Italian translation of the book by Sheldon Ross prints a slightly different set of “Data Points” for a Simple Linear Regression
(only 8 points from the list of Data Points are indicated, with a “Sample size” = 15), and there are small differences between the sets, but … with the same least square estimators:
a= -2.51
b= 0.32
“average x value” (= 46.13)
“Sum of squares of the x values” (= 33212.0)
“estimated regression line” (Y = -2.51 + 0.32x)…
The example used to compute the least square estimators concerned measurements of relative humidity and moisture content.
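For what it’s worth, here is a minimal Python sketch of those least square estimators; the x and y arrays below are placeholders, not the humidity/moisture data from Ross’s book:

import numpy as np

def least_squares_estimators(x, y):
    # b = Sxy / Sxx and a = ybar - b * xbar
    xbar, ybar = x.mean(), y.mean()
    b = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
    a = ybar - b * xbar
    return a, b

x = np.array([30.0, 40.0, 50.0, 60.0])  # hypothetical relative humidity (%)
y = np.array([7.2, 10.1, 13.5, 16.4])   # hypothetical moisture content
a, b = least_squares_estimators(x, y)
print(f"estimated regression line: Y = {a:.2f} + {b:.2f}x")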
— —
Here is the description given in Wikipedia:
>Partial least squares regression (PLS regression) is a statistical method that bears some relation to principal components regression; instead of finding hyperplanes of minimum variance between the response and independent variables, it finds a linear regression model by projecting the predicted variables and the observable variables to a new space. Because both the X and Y data are projected to new spaces, the PLS family of methods are known as bilinear factor models. … etc. …
>… PLS regression is particularly suited when the matrix of predictors has more variables than observations, and when there is multicollinearity among X values. By contrast, standard regression will fail in these cases (unless it is regularized). … etc. … … etc. …
Link:
https://en.wikipedia.org/wiki/Partial_least_squares_regression
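As a minimal sketch of what using PLS looks like in practice (synthetic, spectroscopy-like data with more wavelengths than observations; scikit-learn assumed, nothing here is from the Wikipedia page):

import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 100))  # 20 "spectra", 100 "wavelengths" (p > n)
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=20)

# Project X and y onto a few latent components, then regress.
pls = PLSRegression(n_components=3).fit(X, y)
print(pls.score(X, y))  # R^2 on the training data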
— —
Another topic:
Partial Least Squares Regression and Principal Components Regression.
>This example shows how to apply Partial Least Squares Regression (PLSR) and Principal Components Regression (PCR), and discusses the effectiveness of the two methods. PLSR and PCR are both methods to model a response variable when there are a large number of predictor variables, and those predictors are highly correlated or even collinear. Both methods construct new predictor variables, known as components, as linear combinations of the original predictor variables, but they construct those components in different ways. PCR creates components to explain the observed variability in the predictor variables, without considering the response variable at all. On the other hand, PLSR does take the response variable into account, and therefore often leads to models that are able to fit the response variable with fewer components. Whether or not that ultimately translates into a more parsimonious model, in terms of its practical use, depends on the context.
— —-
>… These data are described in Kalivas, John H., “Two Data Sets of Near Infrared Spectra,” Chemometrics and Intelligent Laboratory Systems, v.37 (1997) pp.255-259.
Link:
http://it.mathworks.com/help/stats/examples/partial-least-squares-regression-and-principal-components-regression.html
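Here is a rough Python analogue of that comparison (my own sketch on synthetic collinear data, not the MathWorks example or the Kalivas data): PCR builds its components from X alone, while PLSR also uses y.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
X = rng.normal(size=(30, 50))  # many, mutually correlated predictors
y = X @ rng.normal(size=50) + rng.normal(scale=0.5, size=30)

pls = PLSRegression(n_components=2).fit(X, y)
pcr = make_pipeline(PCA(n_components=2), LinearRegression()).fit(X, y)
print("PLSR R^2:", pls.score(X, y))
print("PCR  R^2:", pcr.score(X, y))  # PLSR usually needs fewer components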
—- —
The Collinearity Problem in Linear Regression. The Partial Least Squares (PLS) Approach to Generalized Inverses
SIAM J. Sci. and Stat. Comput., 5(3), 735–743. (9 pages)
>The use of partial least squares (PLS) for handling collinearities among the independent variables X in multiple regression is discussed.
>Consecutive estimates are obtained using the residuals from previous rank as a new dependent variable y.
>The PLS method is equivalent to the conjugate gradient method used in Numerical Analysis for related problems.
>To estimate the “optimal” rank, cross validation is used. Jackknife estimates of the standard errors are thereby obtained with no extra computation.
>The PLS method is compared with ridge regression and principal components regression on a chemical example of modelling the relation between the measured biological activity and variables describing the chemical structure of a set of substituted phenethylamines.
Link:
http://epubs.siam.org/doi/abs/10.1137/0905052
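A short sketch of the cross-validation idea for picking the “optimal” rank (my own illustration on synthetic data, not the paper’s phenethylamine example):

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.normal(size=(30, 60))
y = X[:, :4].sum(axis=1) + rng.normal(scale=0.2, size=30)

# Mean cross-validated R^2 for each candidate rank k = 1..7.
scores = [cross_val_score(PLSRegression(n_components=k), X, y, cv=5).mean()
          for k in range(1, 8)]
print("best rank by cross-validation:", int(np.argmax(scores)) + 1)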
— —
As you have seen I am not an expert in Statistics!
Another thing:
I was not completely sure that the Dalí paintings were examined using the viscosimetric method (similar to what was indicated in the past [several years ago] by Diana and Marinelli, in order to detect the cellulosic DP, etc. …).
But I was lucky enough to dig deeper into the issue, and I found that they used:
>…The viscometric method was used to determine the DP of cellulose.
>Although sample consumption is considerable (~30 mg), local heterogeneity is thus avoided leading to lower uncertainty.
>DP was calculated from intrinsic viscosity [η], … … etc. ….
Link:
http://pubs.rsc.org/en/content/articlehtml/2014/ay/c3ay41094c
>Several methods of sample preparation were tested, as viscometry has not been used for determination of the DP of such a variety of real canvas samples before. If present, the primer layer was manually removed, following which the samples were soaked and washed to remove gelatine, dried and defibred.
>The DP of 95 samples was determined.
Obviously we cannot destroy 30 mg (or more) of linen fibrils coming from the Shroud (using the viscometric method)!…
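For illustration, the DP is usually obtained from the intrinsic viscosity through a Mark-Houwink-type relation [η] = K · DP^a. A hedged Python sketch follows (the constants K and a are placeholders that depend on the solvent system, e.g. CED, and must be taken from the literature):

def dp_from_intrinsic_viscosity(eta, K, a):
    # Invert [eta] = K * DP**a for DP.
    return (eta / K) ** (1.0 / a)

# Illustrative numbers only: [eta] = 400 mL/g with assumed K = 0.42, a = 1.0.
print(dp_from_intrinsic_viscosity(400.0, K=0.42, a=1.0))  # ~952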
— — —
Apart from ATR-FTIR there are other ways:
Super-Resolution Microscopy: Principles, Techniques, and Applications (Date: 21 Jan 2014)
by Sinem K. Saka
Here is an excerpt from the Abstract:
>Diffraction sets a physical limit for the theoretically achievable resolution; however, it is possible to circumvent this barrier. That’s what microscopists have been doing in recent years and in many ways at once, starting the era of super-resolution in light microscopy. High-resolution approaches come in various flavors, and each has specific advantages or disadvantages. … …
= AFM, SNOM, etc.
Link:
http://link.springer.com/protocol/10.1007/978-1-62703-983-3_2
in “Neuromethods”, Volume 86, 2014
= Super-Resolution Microscopy Techniques in the Neurosciences
Editors: Eugenio F. Fornasiero, Silvio O. Rizzoli
Link:
http://link.springer.com/book/10.1007/978-1-62703-983-3
Links for a comprehensive review of the single-molecule localization microscopy (SMLM) software packages:
http://bigwww.epfl.ch/palm/
http://bigwww.epfl.ch/smlm/methods/index.html
Here is another chapter:
Data Analysis for Single-Molecule Localization Microscopy
(Date: 21 Jan 2014)
by Steve Wolter, Thorge Holm, Sebastian van de Linde, Markus Sauer
Link:
http://link.springer.com/protocol/10.1007/978-1-62703-983-3_6
STORM, PALM and fPALM:
http://bitesizebio.com/13434/storm-palm-and-fpalm-the-alphabet-soup-of-super-resolution-light-microscopy/
I have found:
“A beginner’s guide to partial least squares analysis.”
by
Haenlein, M., & Kaplan, A. (2004).
Understanding Statistics, 3(4), 283–297.
As available at:
http://www.stat.umn.edu/~sandy/courses/8801/articles/pls.pdf
and
I have also found another paper that may be of interest regarding PLS…
Title:
“Correlation coefficient optimization in partial least-squares regression with application to ATR-FTIR spectroscopic analysis.”
by
Yifang Chen, Jiemei Chen, Tao Pan, Yun Han and Lijun Yao
Anal. Methods, 2015,7, 5780-5786
Link:
http://pubs.rsc.org/en/content/articlelanding/2015/ay/c5ay00441a#!divAbstract
Abstract:
>A wavelength selection method for spectroscopic analysis, named correlation coefficient optimization coupled with partial least-squares (CCO-PLS), is proposed, and was successfully employed for reagent-free ATR-FTIR spectroscopic analysis of albumin (ALB) and globulin (GLB) in human serum. By varying the upper bound of correlation coefficient between absorbance and analyte’s content, the CCO-PLS method achieved multi-band selection. Two PLS-based methods, which used a waveband having positive peaks of the first loading vector (FLV) and a combination of positive peaks of the correlation coefficient spectrum, were also conducted for comparison. Based on the leave-one-out cross-validation for CCO-PLS, appropriate waveband combinations for ALB and GLB were selected, the root-mean-square errors of prediction for validation samples were 1.36 and 1.35 (g L−1) for ALB and GLB, respectively, which were better than the two comparison methods. The CCO-PLS method provided a new approach for multi-band selection to achieve high analytical accuracy for molecular absorption bands that were composed of several spaced wavebands.
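Here is my loose reading of that band-selection idea as a Python sketch (not the authors’ code; the threshold and the data are invented): keep the wavelengths whose absorbance correlates with the analyte content beyond a bound, then fit PLS on the selected bands.

import numpy as np
from sklearn.cross_decomposition import PLSRegression

def select_and_fit(X, y, r_min, n_components=3):
    # Correlation coefficient of each wavelength's absorbance with y.
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    r = (Xc * yc[:, None]).sum(axis=0) / np.sqrt(
        (Xc ** 2).sum(axis=0) * (yc ** 2).sum())
    keep = np.abs(r) >= r_min  # multi-band selection by threshold
    return PLSRegression(n_components=n_components).fit(X[:, keep], y), keep

rng = np.random.default_rng(4)
X = rng.normal(size=(40, 200))  # 40 spectra, 200 wavelengths
y = X[:, 50:60].sum(axis=1) + rng.normal(scale=0.1, size=40)
pls, keep = select_and_fit(X, y, r_min=0.25)
print(keep.sum(), "wavelengths kept")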
— — —
Dear “daveb of wellington nz”,
perhaps our friend Colin knows the article:
“PLS regression methods”
by
Agnar Höskuldsson.
It’s an article first published online on 30 Mar 2005; the original article appeared in the “Journal of Chemometrics”, Volume 2, Issue 3, pages 211–228, June 1988.
Abstract
>In this paper we develop the mathematical and statistical structure of PLS regression. We show the PLS regression algorithm and how it can be interpreted in model building. The basic mathematical principles that lie behind two block PLS are depicted. We also show the statistical aspects of the PLS method when it is used for model building. Finally we show the structure of the PLS decompositions of the data matrices involved.
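Perhaps useful as orientation: a bare-bones sketch of the two-block, NIPALS-style PLS decomposition the abstract describes, written from the standard textbook algorithm (single-y case), not taken from Höskuldsson’s paper:

import numpy as np

def pls_nipals(X, y, n_components):
    # Centre both blocks, then extract one latent component per pass.
    X, y = X - X.mean(axis=0), y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = X.T @ y
        w /= np.linalg.norm(w)   # weight vector
        t = X @ w                # X-scores
        p = X.T @ t / (t @ t)    # X-loadings
        c = (y @ t) / (t @ t)    # y-loading
        X = X - np.outer(t, p)   # deflate X
        y = y - t * c            # deflate y
        W.append(w); P.append(p); q.append(c)
    return np.array(W).T, np.array(P).T, np.array(q)

rng = np.random.default_rng(5)
X = rng.normal(size=(25, 40))
y = X[:, :3].sum(axis=1) + rng.normal(scale=0.1, size=25)
W, P, q = pls_nipals(X, y, n_components=3)
B = W @ np.linalg.solve(P.T @ W, q)  # regression coefficients in X-space
print(B[:5])                         # first three should be near 1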
— —
B.T.W.:
Do you know “Kernel principal component analysis”?
From Wikipedia, the free encyclopedia:
>In the field of multivariate statistics, kernel principal component analysis (= kernel PCA) is an extension of principal component analysis (PCA) using techniques of kernel methods.
>Using a kernel, the originally linear operations of PCA are done in a reproducing kernel Hilbert space with a non-linear mapping. … …
Links:
https://en.wikipedia.org/wiki/Kernel_principal_component_analysis
http://www.sciencedirect.com/science/article/pii/S002228600600322X
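A minimal scikit-learn sketch (synthetic data; the RBF kernel and its gamma are illustrative assumptions):

import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(6)
X = rng.normal(size=(50, 10))

# The RBF kernel maps the data implicitly into a reproducing kernel
# Hilbert space, where ordinary (linear) PCA is then carried out.
kpca = KernelPCA(n_components=2, kernel="rbf", gamma=0.1)
X2 = kpca.fit_transform(X)
print(X2.shape)  # (50, 2)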
———————-
Unfortunately I have turned the discourse into a drifting “analytical journey”, very far away from the topic: “14C test, etc. …”!