Introduction
Structural masonry is a composite material made of units (e.g. ceramic blocks) and mortar which, according to NBR 15270-2 (ABNT, 2005), must withstand a stress of at least 3 MPa. The Brazilian standard considers prism compressive strength the most important criterion for determining the characteristic compressive strength of masonry (f_{k}). A prism consists of two ceramic blocks joined by a bed joint of about 10 mm; its compressive strength is determined by applying a load at constant velocity up to 50% of the failure load and then increasing this velocity so that failure occurs within 1 to 2 minutes. Alternatively, current international standards (EUROPEAN…, 2005; JOINT…, 2001) propose semi-probabilistic models, which correlate the compressive strength of masonry to the strengths of the block and mortar components through a power function.
These empirical models have been successfully implemented and enhanced by several authors. Sýkora and Holický (2010) applied a probabilistic model based on destructive and non-destructive tests to determine the strength of historical masonry structures. Mann (1982), Dayaratnam (1987) and Kaushik, Rai and Jain (2007) proposed equations similar to the one presented by EN 1996-1-1 (EUROPEAN…, 2005). Dymiotis and Gutlederer (2007) proposed a series of new models based on regression analysis involving the mortar and brick compressive strengths and several other parameters relating to the geometry of the brick units. Nagel, Mojsilovic and Sudret (2015) characterized compressive strength as a random variable, based on experimental data collected from 2009 to 2012, concluding that a lognormal distribution describes the masonry compressive strength variable well.
In general, the resistance mechanism of masonry subjected to compression depends basically on the interaction between the blocks and the mortar. In this sense, most codes tend to be conservative, as they consider only the mechanical properties of materials while overlooking qualitative parameters (such as the construction process and uncertainties in the production of the blocks). Mortar joint thickness, for instance, is governed by workmanship and can therefore vary widely within a single building, impacting the compressive strength of the masonry. According to Francis, Horman and Jerrems (1981), if the bed joint thickness is increased from 10 to 25 mm, the average compressive capacity of the masonry is reduced by 25%. Moreover, dimensional parameters also have a significant impact on the construction process. Blocks may be damaged during packaging, shipping or on the building site; thinner webs and shells increase the probability of damaging blocks, generating waste, while bowed blocks increase the consumption of mortar needed to compensate for the warpage.
According to Ramalho and Corrêa (2003), a series of parameters should be taken into consideration before selecting structural masonry as a constructive system; otherwise it may become an expensive alternative to traditional reinforced concrete structures. The blocks currently being produced in Brazil are appropriate for the construction of 16-storey buildings; as the height of the building increases, grouted masonry is required, which greatly affects the economic competitiveness of this system. Even though structural masonry significantly reduces the consumption and waste of materials, there are a few disadvantages in selecting this constructive system. Ramalho and Corrêa (2003) and Roman, Mutti and Araújo (1999) mention the difficulty of adapting the architectural project to new uses (restrictions on space versatility, especially in commercial buildings), the interference between architectural, hydraulic and electrical projects, and the need for skilled labor. Therefore, the adoption of modular coordination during the design phase and of lean manufacturing on the building site is essential for the competitiveness and rationalization of the construction process. The builder should also take into consideration selecting a manufacturer that conforms with industry-wide quality control policies.
The masonry sector in Brazil comprises 6,903 industrial units, mostly small and medium-sized enterprises run as family businesses, with an annual revenue of over 18 billion reais. Over the years, the National Association of the Ceramic Industry (ANICER) has implemented a series of initiatives to improve quality and efficiency, the most significant being the Sector Program of Quality (PSQ). It requires that members enhance quality control of dimensional and mechanical parameters, aiming to standardize quality, add commercial value and enhance safety levels, while also promoting lean manufacturing.
This paper presents a study on the quality control of mechanical and dimensional parameters using statistical inference procedures, with the random variable distribution model adjusted by a non-parametric test (Kolmogorov-Smirnov). As the focus of this paper is to assess the uncertainties in the production of the blocks and their influence on the compressive strength of masonry, international standards were adopted; hence the compressive strength formulation proposed by EN 1996-1-1 was applied. The referred model is then analyzed in a reliability-based framework, with the compressive strength of the block (f_{b}) as a random variable.
Goodness-of-Fit test
In order to characterize a random variable, a data set must be fitted by a statistical model that properly describes its distribution. Several statistical distributions can fit a data sample, so a goodness-of-fit (GoF) test can be conducted to determine which model best suits the data. This kind of procedure provides a critical value, which represents the maximum admissible error between observations of the sample and a specific analytical model. The test also provides a test statistic for each candidate distribution, which must be smaller than the critical value; among candidate models, the smaller the test statistic, the better the fit.
The Kolmogorov-Smirnov (KS) test computes its statistic as the maximum distance between the cumulative distribution function (CDF) and the cumulative sample histogram (Figure 1), providing a reliable test statistic D_{n}, as in Eq. 1:

D_{n} = max_{i} |F_{x}(x_{i}) − S_{n}(x_{i})|, i = 1, …, n (Eq. 1)
Where:
x_{i} is the i^{th} observed value in the sample of size n, rearranged in increasing order; and
S_{n}(x_{i}) is the cumulative frequency value at x_{i}, computed by i/n.
The CDF of a theoretical candidate model is represented by F_{x}(x_{i}). The KS test is based on the cumulative histogram of the sample, which is an advantage over the popular Chi-Square test, which depends on the frequency histogram and is sensitive to the number of intervals (ANG; TANG, 2007).
Critical values for the KS statistic, at a significance level α, are presented in Table 1 as a function of sample size. Note that increasing the significance level adopted results in a more severe test, which imposes a closer agreement between the theoretical model and the data sample, bounded by a lower critical value. More details concerning goodness-of-fit tests can be found in Ang and Tang (2007).
d.o.f. = n  α = 0.20  α = 0.10  α = 0.05  α = 0.01
5  0.45  0.51  0.56  0.67
10  0.32  0.37  0.41  0.49
15  0.27  0.30  0.34  0.40
20  0.23  0.26  0.29  0.36
25  0.21  0.24  0.27  0.32
30  0.19  0.22  0.24  0.29
35  0.18  0.20  0.23  0.27
40  0.17  0.19  0.21  0.25
45  0.16  0.18  0.20  0.24
50  0.15  0.17  0.19  0.23
>50  1.07/√n  1.22/√n  1.36/√n  1.63/√n
Source: Ang and Tang (2007).
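As an illustrative sketch (not the paper's StatFit implementation), the statistic of Eq. 1 can be computed directly from a sorted sample and a candidate CDF. The sample below is synthetic, with moments loosely inspired by the strength data discussed later; the two-sided empirical-CDF comparison matches the convention used by scipy.stats.kstest.

```python
import numpy as np
from scipy.stats import norm, kstest

def ks_statistic(sample, cdf):
    """KS statistic D_n of Eq. 1: largest distance between the empirical
    CDF of the sorted sample and a candidate model CDF."""
    x = np.sort(sample)
    n = len(x)
    F = cdf(x)
    # compare the model CDF against the empirical CDF just after and
    # just before each step (two-sided convention, as in scipy)
    d_plus = np.max(np.arange(1, n + 1) / n - F)
    d_minus = np.max(F - np.arange(0, n) / n)
    return max(d_plus, d_minus)

rng = np.random.default_rng(0)
sample = rng.normal(8.13, 1.63, size=40)   # synthetic strength-like sample
model = norm(loc=sample.mean(), scale=sample.std(ddof=1))
d_n = ks_statistic(sample, model.cdf)
print(d_n)   # compare with the 5% critical value 1.36/sqrt(40), about 0.215
```

The computed D_{n} is then compared with the critical value from Table 1 for the chosen significance level.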
Structural reliability analysis
The increasing development of materials and structural modeling, and the consequent growth in the complexity of structures, demand proper knowledge of the safety levels involved in the design phase. Structural reliability theory provides methods to evaluate these safety (or risk) levels, accounting for the uncertainties inherent to the design. In structural engineering applications, the uncertainties commonly encountered relate to material properties, such as compressive strength, and to dimensional parameters; they stem directly from non-uniformity in the manufacturing process of structural materials and elements. In the light of probability and statistics, these uncertainties are modeled as random variables and collected together in a framework of mathematical models that estimate the probability of failure associated with a specific failure mode defined by the user. Fundamentals and applications of structural reliability theory can be found in Melchers (1999) and Lemaire, Chateauneuf and Mitteau (2009), among others.
Essentially, a structural reliability analysis involves a limit state equation (the so-called failure function), a set of random variables and a reliability evaluation method. In general, a limit state equation G(X) represents a performance function, which measures the violation of an Ultimate Limit State (ULS) or a Serviceability Limit State (SLS): positive values of G(X) indicate safe events and negative values indicate failure events. The failure modes considered in this paper represent the margin for the probabilistic compressive strength to be exceeded by the corresponding deterministic strength. A usual general form of limit state equation relates a resistance term (R) and a load term (L), as shown in Eq. 2, in which the vector X contains the n random variables associated with the problem:

G(X) = R(X) − L(X) (Eq. 2)
Some random variables quantify the uncertainties of geometrical/mechanical properties and other design variables related to the structural element, which influence the strength term R in the failure function. The load term L, on the other hand, can include as random variables the self-weight, external mechanical loads or stresses caused by temperature variation, for instance. Correlation between random variables can also be included in reliability-based problems, although the literature states that treating the variables as independent is a conservative procedure.
Since the focus of this paper is the analysis of compressive strength, only the strength term is assumed to be probabilistic, as discussed later.
Monte Carlo simulation
The method consists in generating n_{s} random scenarios to be tested in the limit state function, computing the number of failure events n_{f} (when G(X) ≤ 0), and estimating the failure probability as P_{f} = n_{f}/n_{s}.
The random scenarios are defined by generating n_{s} random values for each random variable considered in the analysis. Therefore, the statistical characterization of each variable and a random number generator are required. An illustrative example is presented in Figure 2, in which a thousand events are generated. Each event is tested in the limit state function: if the strength (R) is higher than the load (L), the event is safe; otherwise, it is a failure event. In this hypothetical example, R and L are Gaussian variables with mean values 115 and 90, and standard deviations equal to 4 and 10, respectively. It is usual to adopt the notation R = N(115; 4) and L = N(90; 10).
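The procedure above can be sketched in a few lines. The snippet reproduces the hypothetical example R = N(115; 4), L = N(90; 10), with a larger number of scenarios than in Figure 2 (the sample size and seed are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
n_s = 200_000                        # number of random scenarios
R = rng.normal(115, 4, n_s)          # resistance, R = N(115; 4)
L = rng.normal(90, 10, n_s)          # load, L = N(90; 10)
n_f = np.count_nonzero(R - L <= 0)   # failure events: G(X) = R - L <= 0
p_f = n_f / n_s
print(p_f)   # on the order of 1e-2 for these distributions
```

Each run yields a slightly different estimate, which is why the number of scenarios must be tied to a target dispersion of the estimator, as discussed next.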
By its nature, Monte Carlo provides very accurate results, provided an adequate number of simulations is performed. However, the method may have issues with very low failure probabilities, since it needs at least the inverse of the failure probability as the number of scenarios to be capable of detecting one failure event; i.e., if the problem has a probability of failure equal to 10^{-6}, a minimum of 10^{6} scenarios must be generated and simulated. It should be noted that the estimated P_{f} is itself a random variable, assuming different values for each simulation of n_{s} scenarios. Some expressions have been proposed to define a minimum number of scenarios that keeps the coefficient of variation of P_{f} at a desirable level δ, such as the one in Eq. 3:

n_{s} = (1 − P_{f}) / (δ² P_{f}) (Eq. 3)
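Assuming the common estimate in which the coefficient of variation of the Monte Carlo estimator is approximately sqrt((1 − P_f)/(n_s P_f)), the minimum number of scenarios for a target δ can be computed as in the sketch below (an illustration, not the authors' implementation):

```python
def min_scenarios(p_f, delta=0.05):
    """Minimum n_s so that the coefficient of variation of the estimated
    failure probability is approximately delta (Eq. 3 form assumed)."""
    return (1.0 - p_f) / (delta ** 2 * p_f)

# a failure probability near 5e-3 demands on the order of 8e4 scenarios
# at a 5% target COV; near 1e-6 it demands on the order of 4e8
print(min_scenarios(4.95e-3))
```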
First Order Reliability Method (FORM)
According to Melchers (1999), reliability analysis problems can be expressed by considering a limit state function (G(X) = 0) and the adopted random variables X, with P_{f} exactly evaluated by the integral presented in Eq. 4, taken over the failure domain G(X) ≤ 0:

P_{f} = ∫ f_{x}(x) dx, over G(X) ≤ 0 (Eq. 4)
In which f_{x}(x) is the joint probability density function (PDF) of the random variables X. However, depending on the number of random variables, this integral is not easy to solve and numerical approximations must be applied, Monte Carlo simulation being one option. Transformation methods, such as the First Order Reliability Method (FORM), which is analytically derived and iteratively solved, stand out as an interesting choice. The method consists in transforming all random variables (X) into corresponding standardized normally distributed ones (Y); this is done by first applying a normal tail approximation and then reducing the variables to the standard normal probability distribution. It is also necessary to rewrite the limit state function in this standard normal space (G(Y) = 0). In this new space, the probability of failure can be associated with the shortest distance between the new limit state function and the origin of the transformed space. This distance is known as the reliability index β, and its relation with P_{f} is provided by Eq. 5:

P_{f} = Φ(−β) (Eq. 5)
An advantage of this method is that it can be solved faster than Monte Carlo simulation while keeping a good level of accuracy in several applications. Moreover, if the limit state function is linear in the random variables and these have Gaussian distributions, FORM results are exact. Figure 3 illustrates the procedure.
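For the linear Gaussian case just mentioned, the FORM result can be written in closed form, since G = R − L is itself Gaussian and β is simply its mean divided by its standard deviation. Using the same illustrative numbers as in the Monte Carlo example:

```python
from math import sqrt
from scipy.stats import norm

mu_R, sd_R = 115.0, 4.0    # resistance, R = N(115; 4)
mu_L, sd_L = 90.0, 10.0    # load, L = N(90; 10)

# G = R - L is Gaussian, so the reliability index is exact here
beta = (mu_R - mu_L) / sqrt(sd_R ** 2 + sd_L ** 2)
p_f = norm.cdf(-beta)      # Eq. 5: P_f = Phi(-beta)
print(beta, p_f)
```

The resulting P_{f} agrees with the Monte Carlo estimate for the same pair of variables, at a negligible computational cost.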
Another piece of information extracted from FORM is the importance factor of each random variable for the achieved failure probability. This information is associated with the design point Y* and the partial derivatives of G(Y) at this point. The importance factor measures the influence of each random variable in the aleatory process.
Statistical data characterization
A total of 100 hollow blocks (nominal dimensions W x H x L = 14 x 19 x 29 cm), manufactured between February and April 2014, were evaluated for dimensional and mechanical parameters. The data sets are identified according to production month, as FEB14, MAR14 and APR14, containing 35, 35 and 30 blocks, respectively. All blocks were measured, and from those a total of 40 blocks were randomly selected to assess their compressive strength (f_{b}). Both the geometric and mechanical data series were characterized as random variables and adjusted using the Kolmogorov-Smirnov test. The estimation of moments (mean and standard deviation) and the GoF tests were carried out for the candidate distribution models using the in-house software StatFit (beta). Normal, Lognormal, Fréchet and Weibull models were preselected by visual inference and due to their wide application in structural engineering.
Each set of blocks was individually characterized but, to statistically describe the quality control of the manufacturer within the evaluation period, a global data sample was also addressed.
Regarding the dimensional analysis, seven parameters were measured and sampled, as follows: height [H], width [W], length [L], shell thickness [ST], web thickness [WT], rectangularity [R] and straightness of sides [SS].
Statistical parameters for the global sample are presented in Table 2, along with the KS test results. COV denotes the coefficient of variation of the sample. Among the four candidate models, special attention was paid to the Normal distribution, due to its popularity and wide application in engineering problems. It can be noticed that this distribution properly fits four of the seven dimensional variables, presenting a KS maximum error lower than the critical one for a 100-element sample at a 5% significance level.
INFERENCE  H (mm)  L (mm)  W (mm)  STmin (mm)  WTmin (mm)  R (%)  SS (%)
Sample Size  100
Mean  187.50  288.73  134.60  6.08  5.83  2.01  2.63
Standard Deviation  1.3849  1.3763  1.7497  1.1524  0.9337  0.7075  0.9824
COV  0.0074  0.0048  0.0130  0.1896  0.1601  0.3515  0.3731
KS statistic  0.0990  0.0977  0.1307  0.1276  0.1090  0.0557  0.0845
Best-fit model  Normal  Weibull  Fréchet  Normal  Weibull  Normal  Normal
KS Critical Value  0.1360
It is worth noting that the means for height, length and width were smaller than the expected nominal dimensions, which might become a drawback for the adoption of modular coordination. In order to implement this constructive system, which is capable of rationalizing building site logistics from design to final product, the manufacturer may need to make the necessary adjustments through the manufacturing of complementary elements (WINTER, 2005), which adds to the final cost of the building. Also, the means for STmin and WTmin were far below the nominal dimensions (STmin x WTmin = 8 mm x 7 mm), which might imply a high level of brittleness and inadequate transportation by the manufacturer, translating into a smaller compressive capacity of the blocks.
The distribution of variables [L] and [WTmin] is best represented by a Weibull model, while the width [W] is described by a Fréchet function. The histograms and PDFs presented in Figure 5 aim to elucidate this behavior. Although the normal model was not approved for these random variables, its PDF is plotted next to the approved ones, so that the divergence between them can be visualized. In the histograms of [L] and [WTmin], values below the histogram peak prevail, with the peak standing in a rightmost position. This explains the good performance of the Weibull model, which presents a right-shifted peak with a significant left tail, unlike the fast-descending left tail of the symmetrical Gaussian distribution. The opposite behavior can be observed in the results of [W], which was described by a Fréchet model.
It is confirmed here that visual inference is not enough to accurately characterize a random variable, since the visual impression is not always in agreement with the GoF result.
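The model selection logic described in this section (fit each candidate, compute its KS statistic, keep the smallest) can be sketched as below. The sample is synthetic and the candidate set is reduced to three models for brevity; this is an illustration, not the StatFit procedure used by the authors.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# synthetic right-skewed sample standing in for a measured parameter
sample = stats.weibull_min.rvs(1.5, loc=5.0, scale=2.0, size=100,
                               random_state=rng)

candidates = {"Normal": stats.norm,
              "Lognormal": stats.lognorm,
              "Weibull": stats.weibull_min}
results = {}
for name, dist in candidates.items():
    params = dist.fit(sample)                 # maximum-likelihood fit
    results[name] = stats.kstest(sample, dist.cdf, args=params).statistic

best = min(results, key=results.get)          # smallest D_n indicates best fit
print(best, results)
```

In practice each D_{n} must also be checked against the critical value of Table 1 before the best-fitting model is accepted.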
The dependence between dimensional parameters, arising from the manufacturing process, is evaluated by the statistical correlation of some pairs of variables, as presented in Figure 6.
In general, the correlation values are not significant, but some stand out. The one involving the imperfections [R] and [SS], around 12%, indicates a propensity for one parameter to increase as the other increases. The deviation [SS] also presents a negative correlation of 23% with the length of the block [L]. There is another pronounced value, of 19%, between [H] and [W]. This indicates that an uncalibrated mold was used, producing geometrically imprecise blocks: considering a fixed volume of material in the mold, wider and shorter blocks or thinner and taller blocks can occur.
Regarding the compressive strength (f_{b}), blocks from the mills FEB14, MAR14 and APR14 were tested, and the random variable was characterized using the aforementioned GoF test. Table 3 presents the statistical parameters and KS test results for the individual and global samples, considering each sample size, at a 5% significance level.
INFERENCE  FEB14  MAR14  APR14  GLOBAL
Sample Size  15  13  12  40
Mean (MPa)  7.36  8.72  8.45  8.13
Standard Deviation (MPa)  1.6934  1.7801  0.9821  1.6281
COV  0.2301  0.2041  0.1162  0.2003
KS statistic  0.1359  0.1932  0.1373  0.0893
Best-fit model  Normal  Normal  Normal  Normal
KS Critical Value  0.3512  0.3772  0.3926  0.2150
Even though most months had a standard deviation higher than 1, it is worth noting that, overall, the Gaussian distribution fits the four data sets well. FEB14 can be identified as the most dispersed data set, and APR14 was the best month for the f_{b} parameter, due to its lower COV, around half of the others. This successive reduction of COV can be an indication of progress in the process control of the manufacturer, also indicated by a high mean value of f_{b}. Collecting the data together in a global sample yields a 20% COV, associated with a mean value above 8 MPa.
The histograms in Figure 7 reveal the difference between the relative frequencies of occurrence in the three individual mills, despite the fact that a Gaussian model properly describes their distributions. Visual analysis is especially inefficient in these cases, considering the reduced sample sizes. The global data set behaves in a more usual way, presenting histogram bins to the right and left of the central region, which are well captured by the Normal tails.
Reliability analysis of compressive strength
The Brazilian code, as previously stated, relies on prism tests to determine the compressive strength of masonry. Alternatively, this variable can be expressed through empirical models proposed by EN 1996-1-1 (EUROPEAN…, 2005) and JCSS (JOINT…, 2001). As no prisms were built, equation 3.1 of Eurocode 6 was adopted, as presented in Eq. 6:

f_{k} = k f_{b}^{α} f_{m}^{β} (Eq. 6)
From this point on, to avoid confusion with the reliability index β, the exponents α and β are replaced by c1 and c2, as follows (Eq. 7):

f_{k} = k f_{b}^{c1} f_{m}^{c2} (Eq. 7)
In which f_{b} denotes the normalized mean compressive strength of the units, f_{m} denotes the mean compressive strength of the mortar and f_{k} represents the characteristic compressive strength of the masonry. The constants (c1, c2) are defined by the type of mortar and k is defined by the type of block. In this paper, general purpose mortar was assumed and, following the classification of Eurocode 6, the blocks belong to Group 2.
Accordingly, it is proposed to evaluate the probability that the measured compressive strength of masonry (calculated with f_{b} as a random variable) is exceeded by the nominal compressive strength of masonry. This means that this paper does not analyze the probability of failure for an ultimate limit state; instead, it does so for the design equation used in common design practice. This kind of analysis can be useful to verify the influence of the dispersion of the variables in the strength model, evaluating the safety level intrinsically associated with it. The failure function is defined by Eq. 8:

G = f_{k,prob} − f_{k,nom} (Eq. 8)
The deterministic term f_{k, nom} is calculated using equation 3.2 of Eurocode 6, as follows (Eq. 9):
According to NBR 15270-2 (ABNT, 2005), the minimum admissible compressive strength for a clay block to be considered structural is f_{b} = 3 MPa, and NBR 15812-1 (2010) determines that the compressive strength of the mortar (f_{m}) should be within 1.5 MPa ≤ f_{m} ≤ 0.7 f_{bk}. In the proposed scenario, f_{k,nom} is defined for the minimum admissible value of f_{b} given by the Brazilian code, while the probabilistic term f_{k,prob} is assessed by Eq. 9 with f_{b} as a random variable, according to Table 3. It should be noted that f_{m} can be adopted with any value, since it is a deterministic variable that does not influence the P_{f} obtained.
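To illustrate the proposed check, the sketch below evaluates the failure function of Eq. 8 with the GLOBAL f_{b} moments of Table 3. The value k = 0.45 is an assumption (the Eurocode 6 tabulated value for Group 2 clay units with general purpose mortar); the exact constants used by the authors are not restated here. Since f_{m} enters both terms of Eq. 8 identically, G ≤ 0 reduces to f_{b} ≤ 3 MPa, so with a Gaussian f_{b} the failure probability is available in closed form.

```python
from scipy.stats import norm

def f_k(f_b, f_m, k=0.45):
    """Eq. 7 with c1 = 0.7 and c2 = 0.3 (general purpose mortar);
    k = 0.45 is an assumed value for Group 2 clay units."""
    return k * f_b ** 0.7 * f_m ** 0.3

# G = f_k_prob - f_k_nom <= 0  <=>  f_b <= 3 MPa, independently of f_m,
# so P_f = P(f_b <= 3) for the Gaussian f_b characterized in Table 3
mu, sd = 8.13, 1.6281          # GLOBAL sample moments of f_b (Table 3)
beta = (mu - 3.0) / sd          # reliability index
p_f = norm.cdf(-beta)
print(round(beta, 2), p_f)      # compare with the GLOBAL row of Table 4
```

The closed-form result reproduces the order of magnitude reported in Table 4 for the GLOBAL sample, confirming that f_{m} plays no role in this particular P_{f}.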
The failure probabilities and associated reliability indices are presented in Table 4. In applying the Monte Carlo method, the number of realizations was chosen to guarantee a 5% coefficient of variation of the calculated P_{f}, as previously explained:
MILL  MONTE CARLO: n_s  MONTE CARLO: P_{f}  FORM: P_{f}
FEB14  80410  4.95 x 10^{-3} (β = 2.58)  4.94 x 10^{-3} (β = 2.58)
MAR14  609410  6.56 x 10^{-4} (β = 3.21)  6.56 x 10^{-4} (β = 3.21)
APR14  2.857 x 10^{10}  1.40 x 10^{-8} (β = 5.55)  1.34 x 10^{-8} (β = 5.56)
GLOBAL  474100  8.43 x 10^{-4} (β = 3.14)  8.14 x 10^{-4} (β = 3.15)
It can be seen that the three individual samples presented quite different behaviors. The failure probability tends to the 10^{-4} to 10^{-3} range using the random variable f_{b} from the February and March mills, the same range obtained when the global sample is considered. Mill APR14 has a much lower failure probability than the other two data sets, due to its lower COV and high mean value. It should be noted that the FORM results agree with Monte Carlo at a significantly lower computational cost. The smooth non-linearity of the limit state of Eq. 8, due only to the term f_{b}^{0.7} in f_{k,prob}, contributes to this. It is possible that, for strongly non-linear limit state functions, FORM would not perform so accurately.
Tolerable failure probability values are not a matter of consensus in the structural engineering community. They depend on the class of the structure and the cost of failure, among other factors. The implications for human lives and environmental risks are also determinant aspects in the definition of a required safety level. Recommendations in normative codes are only beginning to appear, e.g., the ones based on the suggestions of the Joint Committee on Structural Safety (JOINT…, 2001). In the masonry design sector, this is still a relatively incipient discussion. In the present text, probabilities of failure higher than 10^{-3} are treated as concerning values, based on the technical literature for engineering applications. It is important to remark that the P_{f} evaluated in this paper refers to the probability of exceeding the nominal deterministic value of strength (Eq. 9) and does not take into account any safety factor applied in the design procedure.
Conclusion
This paper provided a statistical characterization of samples of structural ceramic blocks produced in the State of Alagoas, Brazil, during 2014, and their application to a probabilistic design procedure based on structural reliability theory.
The inference indicated that the Gaussian function was able to model the distribution of four of the seven dimensional parameters measured, the others being properly described by extreme-type models, such as Fréchet and Weibull. The prevalence of representative tail values in the latter samples is the major reason for this behavior. Regarding the data sets of compressive strength of the units, the goodness-of-fit tests demonstrated that the Gaussian distribution adequately fits both the monthly samples and the global one.
In the analysis of the compressive strength of masonry, moderate to low failure probability values were observed, except for the case involving the strength of units produced in February 2014, which presented the lowest performance among the data sets, leading to a failure probability around 5 x 10^{-3}. It should be noted that the P_{f} values presented refer to the probability of exceeding the masonry strength when a deterministic load equal to the nominal strength is considered.
Reliability-based analysis proves to be a useful tool for industry and designers to identify aspects in which manufacturing and quality inspection processes must be improved. The correlation values between dimensional parameters, for instance, give the manufacturer the possibility of adjusting its manufacturing process by modifying the clay mixtures or even optimizing its molds.
Another benefit of reliability-based procedures in structural design is the possibility of calibrating safety factors in design equations to achieve a user-defined target reliability index (or failure probability).
The results presented in this paper are only indicative of the probabilistic behavior of the design formulation studied. The P_{f} values themselves must be interpreted with caution, as they reflect the behavior of the specific statistical database addressed herein.