[lpi-discuss] Principles for creation of exam objectives (was: Some thoughts on Sendmail)

Torsten Scheck torsten.scheck at gmx.de
Thu Jul 21 06:51:26 EDT 2005


Dear friends,

I agree with the statements of these two gentlemen:

-------------8<-------------8<-------------8<-------------
Etienne Goyer wrote:
[...]
> I think a
> good grasp of high level SMTP concepts (MX, relay, spooling, interface
> with MDA, result code, etc) would be more important than the minutiae
> of a particular MTA.
[...]

Dimitrios Bogiatzoules wrote:
[...]
> The idea of being neutral is not selecting one MTA of your choice but
> to be able to do the task as described in the objectives regardless of
> what MTA you'll find on a machine. Just a scenario: a postfix admin
> moves to another company that uses exim. What would be the worth of a
> "postfix" centric certificate for him or his employer?
[...]
-------------8<-------------8<-------------8<-------------
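
To make Etienne's point a bit more concrete: the dialogue an MTA
speaks on port 25 and its numeric reply codes look the same whether
sendmail, postfix or exim is listening. A minimal sketch using
Python's standard smtplib module -- host name and addresses are
placeholders only:

  import smtplib

  # Which host to talk to is normally found via the MX record of the
  # recipient's domain; here a placeholder relay is hard-coded.
  conn = smtplib.SMTP("mail.example.org", 25)
  print(conn.ehlo())                    # (250, ...) greeting accepted
  print(conn.mail("alice@example.org")) # 250 = sender accepted
  print(conn.rcpt("bob@example.org"))   # 250 = recipient ok, 550 = rejected
  print(conn.data("Test message\r\n"))  # 250 = accepted and spooled
  conn.quit()

Whichever MTA accepts the message then spools it and eventually hands
it to an MDA such as procmail -- none of these concepts is specific to
sendmail.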

LPI's standardised process for creating exam objectives lacks some
fundamental principles for making the best of our fragmented FOSS
world. The current exam objectives review (led by Taki) already
incorporates some of my thoughts below, but given the current
discussion about Sendmail and similar earlier discussions [1,2], I
believe there is a need to spell out the implicit, the explicit, and
the additionally needed principles.


First of all, please remember that LPI doesn't promise to correctly
assess the skills of any person who stumbles into an LPI exam. LPI's
exams work only in combination with the objectives. So some candidates
will need to learn one or another additional application for the sake
of uniformity. On the other hand, they gain additional expertise. :-)

The Job Task Analysis and our surveys can only identify and weigh job
tasks (e.g. adding a mail alias). Additionally, we can measure the
current popularity of each application or tool that can be used for a
task. But LPI is not a popularity contest. We can't change the choice
of applications and tools every year. It's about creating
quasi-standards for proficiency in Linux and FOSS technologies. This
implies pedagogical aspects like "Does this MTA support the candidate
in his learning process?" or "Can the knowledge about this MTA be
easily transferred to all other MTAs?".
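
To give an idea of how well such a task already transfers between
MTAs: both sendmail and postfix read /etc/aliases and ship a
newaliases command to rebuild the alias database, and exim's usual
configuration reads the same file directly. A minimal sketch (the
alias name and target are made up, and you obviously need the
appropriate privileges to write the file):

  import subprocess

  ALIASES = "/etc/aliases"

  def add_alias(name, target):
      # Append "name: target" to the system-wide aliases file.
      f = open(ALIASES, "a")
      f.write("%s: %s\n" % (name, target))
      f.close()
      # Rebuild the alias database; sendmail and postfix both install
      # newaliases, while exim usually reads /etc/aliases directly.
      subprocess.call(["newaliases"])

  add_alias("webmaster", "alice")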


So, what would be reasonable and comprehensible principles for
choosing the default applications or system utilities included in
LPI's objectives?

My suggestion:

initial objective creation:
  * choose the most often used application, if there is a clear winner
  * otherwise consider pedagogical aspects and future trends

exam objective content:
  * focus on generic knowledge, which could also be applied to the other
    choices

objective revisions (conservative approach):
  * substitute the default application only when it really hurts
  * introduce new applications by adding knowledge about the main
    differences between them and the current default application


As for Sendmail: the current exam objective review has the potential
to focus more on generic MTA knowledge and to reduce Sendmail-specific
knowledge, so that Sendmail can remain our default. But with the
pedagogical aspects and the growing use of postfix in mind, I
anticipate a change in the next big objective review (including a Job
Task Analysis) in 2006/7.

Torsten


[1]
https://group.lpi.org/cgi-bin/publicwiki/view/Operations/RFC8

[2]
In fall 2004, there was a discussion about the problems of NFS and the
need to include other/better network filesystems (see thread: "just
took the 102 exam.. -- more personal opinion (on NFS)"). Bryan's main
conclusion was that LPI's focus on NFS works well as a foundation for
how UNIX/Linux "mounts" remote filesystems.

-- 
Torsten Scheck <torsten.scheck at gmx.de>  Jabber:torsten at i0i0.de
GnuPG 1024D/728E 6696 F43D D622 78F1  F481 45C0 2147 69AB DD54
software engineer:open standards/access/knowledge:enthgnusiast



