Notifications

Exam dates: Fri Jan 13, Wed Jan 18, Fri Jan 20, Mon Jan 23, Fri Jan 27, Wed Feb 1, Fri Feb 3. Capacity: 12 students per date. Schedule: written part 8:30-10:00 in the morning, oral part 12:00-?? in the afternoon.

There will be no exam dates in the week of Feb 6-Feb 10. One additional date may be made available during the first three weeks of the summer semester.

Schedule 

Lectures
Tuesday 15:40 - 17:10 K3  
Friday 13:10 - 14:40 K2  
Tutorial Classes (Link to Moodle)
Wednesday   9:00 - 10:30 K11 Instructor: Šárka Hudecová
Wednesday 10:40 - 12:10 K11 Instructor: Marek Omelka

Course Materials

The course content has been revamped relative to its previous form (2021 and earlier). Hence, no course notes adapted to the current syllabus will be available this year. Students can use the link below to download the course notes from the previous year. Almost all the topics are covered there, although in a different order and at a different level of detail.

Requirements for Credit/Exam 

Tutorial Credit:

The credit for the tutorial sessions will be awarded to any student who satisfies both of the following conditions:

  1. Regular small assignments: A student needs to prepare acceptable solutions to at least 10 out of 12 tutorial class assignments. An assignment can be solved either during the corresponding tutorial class or submitted by a pre-specified deadline.
  2. Project: A student needs to submit a project satisfying the requirements given in the assignment. A corrected version of an unsatisfactory project can be resubmitted once.

The nature of these requirements precludes any additional attempts to obtain the tutorial credit beyond the exceptions listed above.

Exam:

The exam has two parts: written and oral, both conducted on the same day.

Detailed Course Syllabus  

  1. Introduction
    • Simple linear regression: technical and historical view
      Lecture 1, Sep. 30
  2. Linear regression model
    • Definition, assumptions
      Lecture 1, Sep. 30
    • Interpretation of regression parameters
      Lecture 2, Oct. 4
    • Least squares estimation (LSE)
      Lecture 2, Oct. 4
    • Residual sums of squares, fitted values, hat matrix
      Lecture 3, Oct. 7
    • Geometric interpretation of LSE
      Lecture 3, Oct. 7
    • Equivalence of LR models
      Lecture 3, Oct. 7
    • Model with centered covariates
      Lecture 4, Oct. 11
    • Decomposition of sums of squares, coefficient of determination
      Lectures 4-5, Oct. 11 and 14
    • LSE under linear restrictions
      Lecture 5, Oct. 14
  3. Properties of LS estimates
    • Moment properties
      Lecture 6, Oct. 18
    • Gauss-Markov theorem
      Lecture 6, Oct. 18
    • Properties under normality
      Lecture 6, Oct. 18
  4. Statistical inference in LR model
    • Exact inference under normality
      Lecture 7, Oct. 21
    • Submodel testing
      Lecture 8, Oct. 25
    • One-way ANOVA model
      Lecture 8, Oct. 25
    • Connections to maximum likelihood theory
      Lecture 9, Nov. 1
    • Asymptotic inference with random covariates
      Lectures 9-10, Nov. 1 and 4
    • Asymptotic inference with fixed covariates
      Lecture 10, Nov. 4
  5. Predictions
    • Possible objectives of regression analysis
      Lecture 10, Nov. 4
    • Pitfalls of predictions
      Lecture 10, Nov. 4
    • Confidence interval for estimated conditional mean of an existing/future observation
      Lecture 10, Nov. 4
    • Confidence interval for the response of a future observation
      Lecture 11, Nov. 8
  6. Model Checking and Diagnostic Methods I.
    • Residuals, standardized/studentized residuals
      Lecture 11, Nov. 8
    • Residual plots, QQ plots
      Lecture 11, Nov. 8
    • Checking homoskedasticity
      Lecture 11, Nov. 8
  7. Transformation of the response
    • Interpretation of log-transformed model
      Lecture 12, Nov. 11
    • Box-Cox transformation
      Lecture 12, Nov. 11
  8. Parametrization of a single covariate
    • Single factor covariate (one-way ANOVA model)
      Lectures 12-13, Nov. 11 and 15
    • Single numerical covariate
      Lectures 14-15, Nov. 18 and 22
  9. Multiple tests and simultaneous confidence intervals
    • Bonferroni method
      Lectures 15-16, Nov. 22 and 25
    • Tukey method
      Lectures 16-17, Nov. 25 and 29
    • Scheffé method
      Lecture 17, Nov. 29
    • Confidence band for the whole regression surface
      Lecture 17, Nov. 29
  10. Interactions
    • Interactions of two factors: two-way ANOVA
      Lecture 18, Dec. 2
    • Interactions of two numerical covariates
      Lecture 18, Dec. 2
    • Interactions of a numerical covariate with a factor
      Lecture 18, Dec. 2
  11. Regression model with multiple covariates
    • Decomposition of the model with additional covariate
      Lecture 19, Dec. 6
    • Effects on fitted values, residuals, SSe, parameter estimates, predictions
      Lecture 19, Dec. 6
    • Orthogonal covariates
      Lecture 20, Dec. 9
    • Decomposition of regression sum of squares
    • Multicollinearity, variance inflation factor
      Lecture 20, Dec. 9
    • Confounding bias, mediation, assessment of causality
      Lecture 21, Dec. 13
  12. Analysis of variance (ANOVA) models
    • One-way ANOVA review
      Lecture 22, Dec. 16
    • Two-way ANOVA with/without interactions
      Lecture 22, Dec. 16
    • Balanced two-way ANOVA
    • Nested factor effects
  13. Dealing with heteroskedasticity
    • Weighted least squares
    • White's sandwich estimator
  14. Covariate measurement errors
  15. Missing data issues in regression models
  16. Model-building strategies
    • Model choice based on sequential submodel testing
    • Functional form of numerical covariates
    • Inclusion of interactions
    • Goodness of fit measures
    • Step-wise procedures
    • Comparison to AI methods
  17. Model Checking and Diagnostic Methods II.
    • Independence of error terms
    • Leverage points, outliers
    • Influential observations
    • Jackknife residuals
    • DFBetas
    • Cook's distance
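The syllabus above is a topic list rather than lecture notes, but several of the core quantities from items 2-3 (the least squares estimate, the hat matrix, fitted values, residuals, and the coefficient of determination) can be illustrated with a short self-contained sketch. The data, the model, and all numbers below are invented for illustration and are not course material.

```python
import numpy as np

# Simulated data from a simple linear model (all values are illustrative).
rng = np.random.default_rng(0)
n = 50
x = rng.uniform(0, 10, n)
y = 2.0 + 0.5 * x + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), x])          # design matrix with intercept
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)  # LSE via the normal equations

H = X @ np.linalg.solve(X.T @ X, X.T)         # hat matrix: y_hat = H y
y_hat = H @ y                                 # fitted values
resid = y - y_hat                             # residuals

SSe = resid @ resid                           # residual sum of squares
SSt = ((y - y.mean()) ** 2).sum()             # total sum of squares
R2 = 1 - SSe / SSt                            # coefficient of determination

# Identities discussed in the lectures: H is symmetric and idempotent,
# and the residuals are orthogonal to the column space of X.
assert np.allclose(H, H.T)
assert np.allclose(H @ H, H)
assert np.allclose(X.T @ resid, 0)
```

Solving the normal equations with `np.linalg.solve` (rather than inverting X'X explicitly) is the numerically preferable route when X has full column rank.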
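Item 13 names weighted least squares and White's sandwich estimator; the sketch below illustrates both on simulated heteroskedastic data. The variance function, the weights, and all numbers are assumptions chosen for the example, not part of the course material.

```python
import numpy as np

# Simulated heteroskedastic data: the error standard deviation grows with x
# (an assumed variance function, chosen purely for illustration).
rng = np.random.default_rng(1)
n = 200
x = rng.uniform(1, 5, n)
y = 1.0 + 2.0 * x + rng.normal(0, 0.5 * x, n)

X = np.column_stack([np.ones(n), x])

# Ordinary LSE: still unbiased under heteroskedasticity, but the usual
# variance formula sigma^2 (X'X)^{-1} is no longer valid.
XtX_inv = np.linalg.inv(X.T @ X)
beta_ols = XtX_inv @ X.T @ y
resid = y - X @ beta_ols

# White's sandwich estimator: (X'X)^{-1} X' diag(e_i^2) X (X'X)^{-1}.
meat = X.T @ (resid[:, None] ** 2 * X)
V_white = XtX_inv @ meat @ XtX_inv

# Weighted least squares with weights 1/Var(e_i), here treated as known.
w = 1.0 / (0.5 * x) ** 2
beta_wls = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
```

With known weights, WLS is the efficient estimator here; the sandwich estimator instead keeps the OLS point estimate and only repairs its standard errors.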