Welcome to xotl.tools’s documentation!

Collection of disparate utilities.

xotl.tools is essentially an extension to Python's standard library. It does not make up a full framework, but it is useful in a wide variety of scenarios.

What’s new in 2.1.10

  • Improve type hints for several modules.

    We ran mypy 0.782 on a large project of ours that uses many modules of xotl.tools and discovered no major roadblocks, so we think this deserves its own release.

    The list of modules we deem complete:

  • Add official support for Python 3.9. We now run our tests with Python 3.9.

Contents

xotl.tools.bases - Numeric base 32 and base 64 integer representations

Integer encoding and decoding in different bases.

xotl.tools.bases.int2str(number, base=62)[source]

Return the string representation of an integer using a base.

Parameters: base – The base; either an integer or a string with a custom table.

Examples:

>>> int2str(65535, 16)
'ffff'

>>> int2str(65535)
'h31'

>>> int2str(65110208921, 'merchise')
'ehimseiemsce'

>>> int2str(651102, 2)
'10011110111101011110'
xotl.tools.bases.str2int(src, base=62)[source]

Return the integer decoded from a string representation using a base.

Parameters: base – The base; either an integer or a string with a custom table.

Examples:

>>> str2int('ffff', 16)
65535

>>> str2int('1c', 16) == int('1c', 16)
True

>>> base = 'merchise'
>>> number = 65110208921
>>> str2int(int2str(number, base), base) == number
True

>>> base = 32
>>> str2int(int2str(number, base), base) == number
True
class xotl.tools.bases.B32[source]

Handles base-32 conversions.

In base 32, each 5-bit chunk is represented by a single “digit”. Digits comprise all symbols in 0..9 and a..v.

>>> B32.inttobase(32) == '10'
True
>>> B32.basetoint('10')
32
class xotl.tools.bases.B64[source]

Handles [a kind of] base 64 conversions.

This is not standard base64, but a reference-friendly base 64 to help the use case of generating a short reference.

In base 64, each 6-bit chunk is represented by a single “digit”. Digits comprise all symbols in 0..9, a..z, A..Z and the three symbols: ()[.

>>> B64.inttobase(64) == '10'
True
>>> B64.basetoint('10')
64

Warning

In this base, letters are case sensitive:

>>> B64.basetoint('a')
10

>>> B64.basetoint('A')
36

xotl.tools.bound – Helpers for bounded execution of co-routines

New in version 1.6.3.

A bounded execution model

Some features are easy to implement using a generator or co-routine (PEP 342). For instance, you might want to “report units of work” one at a time. These kinds of features could be easily programmed without any bounds whatsoever, and then you might “weave in” the bounds.

This module helps to separate the work-doing function from the boundary-tests definitions.

This document uses the following terminology:

unbounded function

This is the function that does the actual work without testing for any boundary condition. Boundary conditions are not “natural causes” of termination for the algorithm but conditions imposed elsewhere: the environment, resource management, etc.

This function must return a generator, called the unbounded generator.

unbounded generator
The generator returned by an unbounded function. This generator is allowed to yield forever, although it could terminate by itself. So this is actually a possibly unbounded generator, but we keep the shorter term for emphasis.
boundary condition

It’s a condition that does not belong to the logical description of any algorithm. When this condition is met it indicates that the unbounded generator should be closed. The boundary condition is tested each time the unbounded generator yields.

A boundary condition is usually implemented in a single function called the boundary definition.

boundary definition

A function that implements a boundary condition. This function must comply with the boundary protocol (see boundary()).

Sometimes we identify the boundary condition with its boundary definition.

bounded function
It’s the result of applying a boundary definition to an unbounded function.
bounded generator
It’s the result of applying a boundary condition to an unbounded generator.

The bounded execution model takes at least an unbounded generator and a boundary condition. Applying the boundary condition to the unbounded generator ultimately results in a bounded generator, which will behave almost equivalently to the unbounded generator but will stop when the boundary condition yields True or when the unbounded generator itself is exhausted.
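
For instance, here is a minimal sketch of the model using the times() condition described below; counter is a hypothetical unbounded function used only for illustration:

from xotl.tools.bound import times

@times(3)                 # boundary condition: stop after the 3rd item
def counter():            # unbounded function
    i = 0
    while True:           # the unbounded generator may yield forever
        yield i
        i += 1

# Calling the bounded function returns the last value produced before the
# boundary condition became True; here that is the 3rd item, 2.
assert counter() == 2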

Included boundary conditions

xotl.tools.bound.timed(maxtime)[source]

Becomes True after a given amount of time.

The bounded generator will be allowed to yield values until the maxtime time frame has elapsed.

Usage:

@timed(timedelta(seconds=60))
def do_something_in_about_60s():
    while True:
        yield

Note

This is a very soft limit.

We can’t actually guarantee any enforcement of the time limit. If the bounded generator takes too much time or never yields, this predicate can’t do much. This usually helps with batch processing that must not exceed (by too much) a given amount of time.

The timer starts just after the next() function has been called for the predicate initialization. So if the given maxtime is too short, this predicate might halt the execution of the bounded function without allowing any processing at all.

If maxtime is not a timedelta, the timedelta will be computed as timedelta(seconds=maxtime).

xotl.tools.bound.times(n)[source]

Becomes True after the nth item has been produced.

xotl.tools.bound.accumulated(mass, *attrs, initial=0)[source]

Becomes True after accumulating a given “mass”.

mass is the maximum amount allowed to accumulate. This is usually a positive number. The values produced by the unbounded generator are added together; True is yielded when the accumulated amount exceeds the given mass.

If any attrs are provided, they will be considered attributes (or keys) to search inside the yielded data from the bounded function. If no attrs are provided the whole data is accumulated, so it must allow addition. The attribute to be summed is extracted with get_first_of(), so only the first attribute found is added.

If the keyword argument initial is provided the accumulator is initialized with that value. By default this is 0.
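
For example, a minimal sketch that stops once roughly 1000 bytes worth of messages have been yielded; the message objects and their size attribute are hypothetical:

from xotl.tools.bound import accumulated

@accumulated(1000, 'size')
def send_all(messages):
    for message in messages:
        # the boundary sums message.size and becomes True once the
        # accumulated total exceeds 1000
        yield message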

xotl.tools.bound.pred(func, skipargs=True)[source]

Allow “normal” functions to engage within the boundary protocol.

func should take a single argument and return True if the boundary condition has been met.

If skipargs is True, the function func will not be called with the tuple (args, kwargs) upon initialization of the boundary; in that case only the values yielded by the unbounded generator are passed. If you need to get the original arguments, set skipargs to False; in this case the first time func is called it will be passed a single argument: the tuple (args, kwargs).

Example:

>>> @pred(lambda x: x > 10)
... def fibonacci():
...     a, b = 1, 1
...     while True:
...        yield a
...        a, b = b, a + b

>>> fibonacci()
13
xotl.tools.bound.until_errors(*errors)[source]

Becomes True after any of errors has been raised.

Any other exception (except GeneratorExit) is propagated. You must pass at least one error.

Normally this allows possibly long jobs to be interrupted (by SoftTimeLimitExceeded in a celery task, for instance) while leaving some time for the caller to clean things up.

It’s assumed that your job can be properly finalized after any of the given exceptions has been raised.

Parameters: on_error – A callable that will only be called if the boundary condition is ever met, i.e. if any of errors was raised. The callback is called before yielding True.

New in version 1.7.2.

Changed in version 1.7.5: Added the keyword argument on_error.
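
A sketch, assuming celery's SoftTimeLimitExceeded exception and a hypothetical handle() helper doing the per-row work:

from celery.exceptions import SoftTimeLimitExceeded

from xotl.tools.bound import until_errors

@until_errors(SoftTimeLimitExceeded)
def process(rows):
    for row in rows:
        handle(row)   # hypothetical; may raise SoftTimeLimitExceeded
        yield row     # if the error is raised, the boundary becomes True
                      # and the generator is closed so it can clean up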

xotl.tools.bound.until(time=None, times=None, errors=None)[source]

An idiomatic alias to other boundary definitions.

  • until(maxtime=n) is the same as timed(n).
  • until(times=n) is the same as times(n).
  • until(pred=func, skipargs=skip) is the same as pred(func, skipargs=skip).
  • until(errors=errors, **kwargs) is the same as until_errors(*errors, **kwargs).
  • until(accumulate=mass, path=path, initial=initial) is the same as
    accumulated(mass, *path.split('.'), initial=initial)

Warning

You cannot mix several of these conditions in a single call.

New in version 1.7.2.

Chaining several boundary conditions

To create a more complex boundary than the one provided by a single condition, you could use the following high-level boundaries:

xotl.tools.bound.whenany(*boundaries)[source]

An OR-like boundary condition.

It takes several boundaries and returns a single one that behaves like the logical OR, i.e., it will yield True when any of its subordinate boundary conditions yields True.

Calls close() of all subordinates upon termination.

Each boundary should be either:

  • A “bare” boundary definition that takes no arguments.
  • A boundary condition (i.e. an instance of BoundaryCondition). This is the result of calling a boundary definition.
  • A generator object that complies with the boundary protocol. This cannot be tested upfront; a misbehaving generator will cause a RuntimeError if a boundary protocol rule is not followed.

Any other type raises a TypeError.
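
For example, a sketch that stops a job when either 60 seconds have elapsed or 100 items have been produced (fetch_next() is a hypothetical producer):

from xotl.tools.bound import timed, times, whenany

@whenany(timed(60), times(100))
def job():
    while True:
        yield fetch_next()   # hypothetical per-item work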

xotl.tools.bound.whenall(*boundaries)[source]

An AND-like boundary condition.

It takes several boundaries and returns a single one that behaves like the logical AND, i.e., it will yield True when all of its subordinate boundary conditions have yielded True.

It ensures that once a subordinate yields True it won’t be sent more data, no matter if other subordinates keep on running and consuming data.

Calls close() of all subordinates upon termination.

Each boundary should be either:

  • A “bare” boundary definition that takes no arguments.
  • A boundary condition (i.e. an instance of BoundaryCondition). This is the result of calling a boundary definition.
  • A generator object that complies with the boundary protocol. This cannot be tested upfront; a misbehaving generator will cause a RuntimeError if a boundary protocol rule is not followed.

Any other type raises a TypeError.

Defining boundaries

If none of the boundaries defined above deals with a boundary condition you have, you may create another one using boundary(). This is usually employed as a decorator on the boundary definition.

xotl.tools.bound.boundary(definition)[source]

Helper to define a boundary condition.

The definition must be a function that returns a generator. The following rules must be followed. Collectively these rules are called the boundary protocol.

  • The boundary definition will yield True when and only when the boundary condition is met. Only the value True will signal the boundary condition.

  • The boundary definition must yield at least 2 times:

    • First, its next() method will be called to allow for initialization of internal state.
    • Immediately after, its send() method will be called, passing the tuple (args, kwargs) with the arguments passed to the unbounded function. At this point the boundary definition may yield True to halt the execution. In this case, the unbounded generator won’t be asked for any value.
  • The boundary definition must yield True before terminating with a StopIteration. For instance, the following definition is invalid because it ends without yielding True:

    @boundary
    def invalid():
        yield
        yield False
    
  • The boundary definition must deal with GeneratorExit exceptions properly, since we call the close() method of the generator upon termination. Termination occurs when the unbounded generator stops by any means: because the boundary condition yielded True, because the generator itself was exhausted, or because there was an error in the generator.

    Both whenall() and whenany() call the close() method of all their subordinate boundary conditions.

    Most of the time this reduces to not catching GeneratorExit exceptions.

A RuntimeError may happen if any of these rules is not followed by the definition. Furthermore, this error will occur when invoking the bounded function and not when applying the boundary to the unbounded generator.

Illustration of a boundary

Let’s explain in detail the implementation of times() as an example of how a boundary condition could be implemented.

 1  @boundary
 2  def times(n):
 3      '''Becomes True after the `nth` item has been produced.'''
 4      passed = 0
 5      yield False
 6      while passed < n:
 7          yield False
 8          passed += 1
 9      yield True

We implemented the boundary condition via the boundary() helper. This helper allows implementing the boundary condition via a boundary definition (the function above). The boundary helper takes the definition and builds a BoundaryCondition instance. This instance can then be used to decorate the unbounded function, returning a bounded function (a Bounded instance).

When the bounded function is called, what actually happens is that:

  • First the boundary condition is invoked passing the n argument, and thus we obtain the generator from the times function.

  • We also get the generator from the unbounded function.

  • Then we call next(boundary) to allow the times boundary to initialize itself. This runs the code of the times definition up to line 5 (the first yield statement).

  • The bounded function ignores the message from the boundary at this point.

  • Then it sends the arguments passed to the original function via the send() method of the boundary condition generator.

  • This unfreezes the boundary condition, which now tests whether passed is less than n. If this is true, the boundary yields False and suspends there at line 7.

  • The bounded function sees that the message is not True and asks the unbounded generator for its next value.

  • Then it sends that value to the boundary condition generator, which resumes execution at line 8. The value sent is ignored and passed gets incremented by 1.

  • Again the generator tests whether passed is less than n. If passed has reached n, it will execute line 9, yielding True.

  • The bounded function sees that the boundary condition is True and calls the close() method of the boundary condition generator.

  • This is like raising a GeneratorExit inside times just after line 9. The error is not trapped and propagates, so the close() method of the generator knows the generator has finished properly.

    Note

    Other boundaries might need to deal with GeneratorExit explicitly.

  • Then the bounded function regains control and calls the close() method of the unbounded generator. This effectively raises a GeneratorExit inside the unbounded generator, which, if not caught, means everything went well.

If you look at the implementation of the included boundary conditions, you’ll see that all have the same pattern:

  1. Initialization code, followed by a yield False statement. This is a clear indicator that the included boundary conditions disregard the first message (the arguments to the unbounded function).
  2. A looping structure that tests the condition has not been met and yields False at each cycle.
  3. The yield True statement outside the loop to indicate the boundary condition has been met.

This pattern is not an accident. Exceptionally, whenall() and whenany() lack the first standalone yield False because they must not assume all their subordinate predicates will ignore the first message.
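
The pattern can be summarized in the following skeleton, where condition_met() stands for whatever test your boundary needs (it is hypothetical):

from xotl.tools.bound import boundary

@boundary
def my_condition():
    # 1. initialization; the first message (args, kwargs) is disregarded
    yield False
    # 2. keep yielding False while the condition has not been met
    while not condition_met():
        yield False
    # 3. signal that the boundary condition has been met
    yield True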

Internal API

class xotl.tools.bound.Bounded(target)[source]

The bounded function.

This is the result of applying a boundary definition to an unbounded function (or generator).

If target is a function this instance can be called several times. If it’s a generator then it will be closed after either calling (__call__) this instance, or consuming the generator given by generate().

This class is actually subclassed inside apply() so that the weaving of the boundary definition with the target unbounded function is not exposed.

__call__(*args, **kwargs)[source]

Return the last value from the underlying bounded generator.

generate(*args, **kwargs)[source]

Return the bounded generator.

This method exposes the bounded generator. This allows you to “see” all the values yielded by the unbounded generator up to the point when the boundary condition is met.
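
A minimal sketch using the times() condition:

from xotl.tools.bound import times

@times(3)
def numbers():
    n = 0
    while True:
        yield n
        n += 1

assert list(numbers.generate()) == [0, 1, 2]   # every value up to the boundary
assert numbers() == 2                          # __call__ returns only the last one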

class xotl.tools.bound.BoundaryCondition(definition, name=None, errors=None)[source]

Embodies the boundary protocol.

The definition argument must be a function that implements a boundary definition. This function may take arguments to initialize the state of the boundary condition.

Instances are callables that will return a Bounded subclass specialized with the application of the boundary condition to a given unbounded function (target). For instance, times(6) returns a class that, when instantiated with a target, represents the bounded function that takes the 6th value yielded by target.

If the definition takes no arguments for initialization, you may pass the target directly. This means that if __call__() receives arguments, they will be used to instantiate the Bounded subclass, i.e. this case allows only a single argument, the target.

If errors is not None, it should be a tuple of exceptions to catch and throw inside the boundary condition definition. Other exceptions, besides GeneratorExit and StopIteration, are not handled (so they bubble up). See until_errors().

An example: time bounded batch processing

We have a project in which we need to send emails inside a cron task (celery is not available). Emails to be sent are placed inside an Outbox but we may only spend about 60 seconds sending as many emails as we can. If our emails are reasonably small (i.e. they will be delivered to the SMTP server in a few milliseconds) we can use the timed() predicate to bound the execution of the task:

@timed(50)
def send_emails():
   outbox = Outbox.open()
   try:
      for message in outbox:
         emailbackend.send(message)
         outbox.remove(message)
         yield message
   except GeneratorExit:
      # This means the time we were given is up.
      pass
   finally:
      outbox.close()  # commit the changes to the outbox

Notice that you must enclose your batch-processing code in a try statement if you need to somehow commit changes, since we may call the close() method of the generator to signal that it must stop.

A finally clause is not always appropriate because an error other than GeneratorExit should not commit the data, unless you’re sure the changes made before the error are safe to keep. In the code above the only place where an error could happen is the sending of the email, and the data is only touched for each email that is actually sent, so we can safely close our outbox and commit the removal of the previously sent messages.

Using the Bounded.generate() method

Calling a bounded function simply returns the last value produced by the unbounded generator, but sometimes you need to actually see all the values produced. This is useful if you need to meld several generators with partially overlapping boundary conditions.

Let’s give an example by extending a bit the example given in the previous section. Assume you now need to extend your cron task to also read an Inbox as much as it can and then send as many messages as it can. Both things should be done within a given amount of time; however, the accumulated size of sent messages should not surpass a threshold of bytes to avoid congestion.

For this task you may use both timed() and accumulated(). But you must apply accumulated() only to the process of sending the messages and the timed boundary to the overall process.

This can be accomplished like this:

def communicate(interval, bandwidth):
    from itertools import chain as meld

    def receive():
        for message in Inbox.receive():
            yield message

    @accumulated(bandwidth, 'size')
    def send():
        for message in Outbox.messages():
            yield message

    @timed(interval)
    def execute():
        for _ in meld(receive(), send.generate()):
            yield
    return execute()

Let’s break this into its parts:

  • The receive function reads the Inbox and yields each message received.

    It is actually an unbounded function but we don’t want to bound its execution in isolation.

  • The send unbounded function sends every message we have in the Outbox and yields each one. In this case we can apply the accumulated boundary to get a Bounded instance.

  • Then we define an execute function bounded by timed. This function melds the receive and send processes, but we can’t actually call send because we need to yield after each message has been received or sent. That’s why we need to call generate(), so that the time boundary is also applied to the sending process.

Note

The structure of this example is actually taken from a real program, although simplified for learning purposes. For instance, in our real-world program bandwidth could be None to indicate that no size limit should be applied to the sending process. Also, in the example we’re not actually saving nor sending any messages!

xotl.tools.cli – Command line application facilities

Applications

Tools

Utilities for command-line interface (CLI) applications.

  • program_name(): calculate the program name from “sys.argv[0]”.
  • command_name(): calculate command names using class names in lower case, inserting a hyphen before each new capital letter.
xotl.tools.cli.tools.command_name(cls)[source]

Calculate a command name from given class.

Names are calculated putting class names in lower case and inserting hyphens before each new capital letter. For example “MyCommand” will generate “my-command”.

It’s defined as an external function because a class method doesn’t apply to minimal commands (those with only the “run” method).

Example:

>>> class SomeCommand:
...     pass

>>> command_name(SomeCommand) == 'some-command'
True

If the command class has an attribute command_cli_name, this will be used instead:

>>> class SomeCommand:
...    command_cli_name = 'adduser'

>>> command_name(SomeCommand) == 'adduser'
True

It’s an error to have a non-string command_cli_name attribute:

>>> class SomeCommand:
...    command_cli_name = None

>>> command_name(SomeCommand)  
Traceback (most recent call last):
   ...
TypeError: Attribute 'command_cli_name' must be a string.
xotl.tools.cli.tools.hyphen_name(name, join_numbers=True)[source]

Convert a name to a hyphened slug.

Expects a name in CamelCase. All invalid characters (those invalid in Python identifiers) are ignored. Numbers are joined with the preceding part when join_numbers is True.

For example:

>>> hyphen_name('BaseNode') == 'base-node'
True

>> hyphen_name('--__ICQNámeP12_34Abc--') == 'icq-name-p12-34-abc'
True

>> hyphen_name('ICQNámeP12', join_numbers=False) == 'icq-name-p-12'
True
xotl.tools.cli.tools.program_name()[source]

Calculate the program name from “sys.argv[0]”.

xotl.tools.clipping - Object string representation protocol

Complements for object string representation protocol.

There are contexts in which using the str or repr protocol would be inadequate because shorter string representations are expected (e.g. when formatting recursive objects in the pprint standard module, which in Python 3 has a Boolean parameter named compact).

There is a protocol that complements the operators used by the standard string representation functions (__str__, __repr__) by defining a new one named __crop__. This operator will receive some extra parameters with default values; see the crop() function for details.

xotl.tools.clipping.crop(obj, max_width=None, canonical=False)[source]

Return a reduced string representation of obj.

Classes can define a new special attribute __crop__. It can be a string (or unicode in Python 2), or a method:

def __crop__(self, max_width=None, canonical=False):
    pass

If the obj does not implement the __crop__ protocol, a standard one is computed.

Parameters:
  • max_width – Maximum length for the resulting string. If is not given, defaults to DEFAULT_MAX_WIDTH.
  • canonical – If True, the repr() protocol is used instead of str() (the default).

New in version 1.8.0.
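
A sketch of the protocol, assuming crop() delegates to __crop__ when it is defined; the Pair class is hypothetical:

from xotl.tools.clipping import crop

class Pair:
    def __init__(self, a, b):
        self.a, self.b = a, b

    def __crop__(self, max_width=None, canonical=False):
        # return a shortened representation honoring max_width
        return 'Pair(...)'

short = crop(Pair(1, 2))   # expected to use __crop__ and return 'Pair(...)'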

xotl.tools.clipping.crop_iterator(obj, max_width=None, canonical=False)[source]

Return a reduced string representation of the iterator obj.

See crop() function for a more general tool.

If max_width is not given, defaults to DEFAULT_MAX_WIDTH.

New in version 1.8.0.

xotl.tools.context - Simple execution contexts

Note

About thread-locals and contexts.

The context uses internally a thread-local instance to keep context stacks in different threads from seeing each other.

If, when this module is imported, greenlet is imported already, greenlet isolation is also warranted (which implies thread isolation).

If you use collaborative multi-tasking based on a framework other than greenlet, you must monkey-patch the threading.local class so that isolation is kept.

In future releases of xotl.tools, we plan to provide a way to inject a “process” identity manager so that other frameworks can be easily integrated.

Changed in version 1.7.1: Changed the test about greenlet. Instead of testing for greenlet to be importable, test it is imported already.

Changed in version 1.6.9: Added direct greenlet isolation and removed the need for gevent.local.

New in version 1.6.8: Uses gevent.local if available to isolate greenlets.

xotl.tools.cpystack - Utilities to inspect the CPython’s stack

Utilities to inspect the CPython’s stack.

xotl.tools.cpystack.getargvalues(frame)[source]

Inspects the given frame for arguments and returns a dictionary that maps parameter names to argument values. If a * argument was passed, then the key in the returned dictionary is formatted as <name-of-*-param>[index].

For example in the function:

>>> def autocontained(a, limit, *margs, **ks):
...    import sys
...    return getargvalues(sys._getframe())

>>> autocontained(1, 12)['limit']
12

>>> autocontained(1, 2, -10, -11)['margs[0]']
-10
xotl.tools.cpystack.error_info(*args, **kwargs)[source]

Get error information in current trace-back.

Not all trace-backs are returned; to select which are returned, use:

  • args: Positional parameters

    • If a string, it represents the name of a function.
    • If an integer, a trace-back level.

    All local variables are returned for these.

  • kwargs: The same as args but each value is a list of local names to return. If a value is True, it means all local variables.

Return a list with a dict in each item.

Example:

>>> def foo(x):
...     x += 1//x
...     if x % 2:
...         bar(x - 1)
...     else:
...         bar(x - 2)

>>> def bar(x):
...     x -= 1//x
...     if x % 2:
...         foo(x//2)
...     else:
...         foo(x//3)

>>> try:    
...     foo(20)
... except:
...     print(printable_error_info('Example', foo=['x'], bar=['x']))
Example
   ERROR: integer division or modulo by zero
   ...
xotl.tools.cpystack.object_info_finder(obj_type, arg_name=None, max_deep=25)[source]

Find an object of the given type through all arguments in stack frames.

Returns a tuple with the following values:
(arg-value, arg-name, deep, frame).

When no object is found None is returned.

Arguments:
  • obj_type: a type or a tuple of types, as in isinstance().
  • arg_name: the argument name to find; if None, search in all arguments.
  • max_deep: the maximum depth to descend into the stack frames.
xotl.tools.cpystack.object_finder(obj_type, arg_name=None, max_deep=25)[source]

Find an object of the given type through all arguments in stack frames.

The difference with object_info_finder() is that this function returns the object directly, not a tuple.

xotl.tools.cpystack.track_value(value, max_deep=25)[source]

Find a value through all arguments in stack frames.

Returns a dictionary with the full context at the same level as value.

xotl.tools.cpystack.iter_stack(max_deep=25)[source]

Iterates through stack frames until exhausted or max_deep is reached.

To find a frame fulfilling a condition use:

frame = next(f for f in iter_stack() if condition(f))

New in version 1.6.8.

xotl.tools.cpystack.iter_frames(max_deep=25)[source]

Iterates through all stack frames.

Returns tuples with the following:

(deep, filename, line_no, start_line).

New in version 1.1.3.

Deprecated since version 1.6.8: The use of params attr_filter and value_filter.

xotl.tools.crypto - Other cryptographic services

General security tools.

Adds the ability to generate new passwords using a source pass-phrase and a security strength level.

xotl.tools.crypto.generate_password(pass_phrase, level=3)[source]

Generate a password from a source pass-phrase and a security level.

Parameters:
  • pass_phrase – String pass-phrase to be used as base of password generation process.
  • level – Numerical security level (the bigger the more secure, but don’t exaggerate!).

When pass_phrase is a valid string, level selects a generation method. Each level implies all others with a lower numerical value.

There are several definitions with numerical values for level (0-4):

PASS_PHRASE_LEVEL_BASIC

Generate the same pass-phrase, just removing invalid characters and converting the result to lower-case.

PASS_PHRASE_LEVEL_MAPPED

Replace some characters with new values: 'e'->'3', 'i'->'1', 'o'->'0', 's'->'5'.

PASS_PHRASE_LEVEL_MAPPED_MIXED

Consonant characters up to ‘M’ (inclusive) are converted to upper-case; all others are kept lower-case.

PASS_PHRASE_LEVEL_MAPPED_DATED

Adds a suffix with the year of current date (“<YYYY>”).

PASS_PHRASE_LEVEL_STRICT

Randomly scramble the previous result to obtain a strong, hard-to-predict password.

If pass_phrase is None or an empty string, generate a “secure salt” (a password not based on a source pass-phrase). A “secure salt” is generated by scrambling the concatenation of random phrases from the alphanumeric vocabulary.

The returned password size is 4*level, except when a pass-phrase is given for level <= 4, in which case the size depends on the count of valid characters in the pass_phrase argument (although a minimum size is guaranteed). When pass_phrase is None and level is zero or negative, a size of 4 is assumed. The first four levels are considered weak.

Maximum size is defined in the MAX_PASSWORD_SIZE constant.

Default level is PASS_PHRASE_LEVEL_MAPPED_DATED when using a pass-phrase.
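
A usage sketch; the resulting passwords depend on the level and, for PASS_PHRASE_LEVEL_STRICT, on randomness:

from xotl.tools.crypto import (
    PASS_PHRASE_LEVEL_MAPPED,
    PASS_PHRASE_LEVEL_STRICT,
    generate_password,
)

password = generate_password('my secret phrase', PASS_PHRASE_LEVEL_MAPPED)
strong = generate_password('my secret phrase', PASS_PHRASE_LEVEL_STRICT)
salt = generate_password(None, 10)   # a "secure salt", not based on a phrase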

xotl.tools.crypto.PASS_PHRASE_LEVEL_BASIC = 0

The most basic (least secure) level for password generation.

xotl.tools.crypto.PASS_PHRASE_LEVEL_MAPPED = 1

A level that simply maps several characters.

xotl.tools.crypto.PASS_PHRASE_LEVEL_MAPPED_MIXED = 2

Another “stronger” mapping level.

xotl.tools.crypto.PASS_PHRASE_LEVEL_MAPPED_DATED = 3

Appends the year after mapping.

xotl.tools.crypto.PASS_PHRASE_LEVEL_STRICT = 4

Totally scramble the result, making it very hard to predict.

xotl.tools.crypto.DEFAULT_PASS_PHRASE_LEVEL = 3

The default level for generate_password()

xotl.tools.crypto.MAX_PASSWORD_SIZE = 512

An upper limit for generated password length.

xotl.tools.decorator - Several decorators

This module contains several useful decorators for several purposes. It also serves as a namespace for other well-defined types of decorators.

Warning

This module will be progressively deprecated during the 1.6 series.

We feel that either xotl.tools.objects or xotl.tools.functools is a better match for some of these decorators. But since we need to be careful about dependencies, the deprecation won’t be final until 1.7.0. After 1.8.0, this module will be removed.

Top-level decorators

class xotl.tools.decorator.AttributeAlias(attr_name)[source]

Descriptor to create aliases for object attributes.

This descriptor is mainly to be used internally by aliases() decorator.

xotl.tools.decorator.settle(**kwargs)[source]

Returns a decorator to settle attributes to the decorated target.

Usage:

>>> @settle(name='Name')
... class Person:
...    pass

>>> Person.name
'Name'
xotl.tools.decorator.namer(name, **kwargs)[source]

Like settle(), but ‘__name__’ is a required positional argument.

Usage:

>>> @namer('Identity', custom=1)
... class I:
...    pass

>>> I.__name__
'Identity'

>>> I.custom
1
xotl.tools.decorator.aliases(*names, **kwargs)[source]

In a class, create an AttributeAlias descriptor for each definition given as a keyword argument (alias=existing_attribute).

If names are given, the enclosing definition context is looked up and the decorated target is assigned to it under each of the new names:

>>> @aliases('foo', 'bar')
... def foobar(*args):
...     'This function is added to its module with two new names.'
xotl.tools.decorator.assignment_operator(func, maybe_inline=False)[source]

Makes a function that receives a name (and other args) get its first argument (the name) from an assignment operation, meaning that if it is used in a single assignment statement, the name will be taken from the left side of the = operator.

Warning

This function depends on CPython’s implementation of the language and probably won’t work on other implementations. Use it only if you don’t care about portability, and use it sparingly (in case you change your mind about portability).

xotl.tools.decorator.instantiate(target, *args, **kwargs)[source]

Some singleton classes must be instantiated as part of their declaration because they represent singleton objects.

Every argument, positional or keyword, is passed as such when invoking the target. The following two code samples show two cases:

>>> @instantiate
... class Foobar:
...    def __init__(self):
...        print('Init...')
Init...


>>> @instantiate('test', context={'x': 1})
... class Foobar:
...    def __init__(self, name, context):
...        print('Initializing a Foobar instance with name={name!r} '
...              'and context={context!r}'.format(**locals()))
Initializing a Foobar instance with name='test' and context={'x': 1}

In all cases, Foobar remains the class, not the instance:

>>> Foobar  
<class '...Foobar'>
xotl.tools.decorator.constant_bagger(func, *args, **kwds)[source]

Create a “bag” with constant values.

The decorated object must be a callable; the result will be a class containing the constant values.

For example:

>>> @constant_bagger
... def MYBAG():
...     return dict(X=1, Y=2)

It will generate:

class MYBAG:
    X = 1
    Y = 2

When called with arguments, these will be used as actual arguments for the decorated function:

>>> @constant_bagger(X=1, Y=2)
... def MYBAG(**kwds):
...     return kwds

Constant bags are singletons that can be updated:

>>> MYBAG(Z=3) is MYBAG
True

>>> MYBAG.Z
3
class xotl.tools.decorator.memoized_instancemethod(fget, doc=None)[source]

Decorate a method to memoize its return value.

Best applied to no-arg methods: memoization is not sensitive to argument values, and will always return the same value even when called with different arguments.
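
A usage sketch on a hypothetical class with a no-arg method:

from xotl.tools.decorator import memoized_instancemethod

class Config:
    @memoized_instancemethod
    def load(self):
        print('loading...')          # expected to run only once per instance
        return {'debug': True}

cfg = Config()
cfg.load()   # computes and caches the result
cfg.load()   # returns the cached value without recomputing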

This is extracted from the SQLAlchemy project’s codebase, merit and copyright goes to SQLAlchemy authors:

Copyright (C) 2005-2011 the SQLAlchemy authors and contributors

This module is part of SQLAlchemy and is released under the MIT License:
http://www.opensource.org/licenses/mit-license.php

Sub packages

xotl.tools.decorator.development - Decorators for development annotations
xotl.tools.decorator.development.unstable(target, msg=None)[source]

Declares that a method, class or interface is unstable.

This has the side-effect of issuing a warning the first time the target is invoked.

The msg parameter, if given, should be string that contains, at most, two positional replacement fields ({0} and {1}). The first replacement field will be the type of target (interface, class or function) and the second matches target’s full name.

xotl.tools.decorator.meta - Decorator-making facilities

Decorator-making facilities.

This module provides a signature-keeping version of the xotl.tools.decorators.decorator(), which is now deprecated in favor of this module’s version.

We split the decorator-making facilities from the decorators per se to allow the module xotl.tools.deprecation to be used by decorators and, at the same time, to implement the deprecated() decorator more easily.

This module is an adapted work from the decorator version 3.3.2 package and is copyright of its owner as stated below. Adaptation work is done by Merchise.

Original copyright and license notices from decorator package:

Copyright (c) 2005-2011, Michele Simionato

All rights reserved.

Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. Redistributions in bytecode form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS “AS IS” AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDERS OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

class xotl.tools.decorator.meta.FunctionMaker(func=None, name=None, signature=None, defaults=None, doc=None, module=None, funcdict=None)[source]

An object with the ability to create functions with a given signature. It has attributes name, doc, module, signature, defaults, dict and methods update and make.

classmethod create(obj, body, evaldict, defaults=None, doc=None, module=None, addsource=True, **attrs)[source]

Create a function from the strings name, signature and body. “evaldict” is the evaluation dictionary. If addsource is true an attribute __source__ is added to the result. The attributes attrs are added, if any.

make(src_templ, evaldict=None, addsource=False, **attrs)[source]

Make a new function from a given template and update the signature

update(func, **kw)[source]

Update the signature of func with the data in self

xotl.tools.decorator.meta.flat_decorator(caller, func=None)[source]

Creates a signature keeping decorator.

decorator(caller) converts a caller function into a decorator.

decorator(caller, func) decorates a function using a caller.

Deprecated since version 1.9.9: Use the decorator package.

xotl.tools.decorator.meta.decorator(caller)[source]

Eases the creation of decorators with arguments. Normally a decorator with arguments needs three nested functions like this:

def decorator(*decorator_arguments):
    def real_decorator(target):
        def inner(*args, **kwargs):
            return target(*args, **kwargs)
        return inner
    return real_decorator

This decorator removes the need for the first level by merging both into a single function definition. However, it does not remove the need for an inner function:

>>> @decorator
... def plus(target, value):
...    from functools import wraps
...    @wraps(target)
...    def inner(*args):
...        return target(*args) + value
...    return inner

>>> @plus(10)
... def ident(val):
...     return val

>>> ident(1)
11

A decorator with default values for all its arguments (except, of course, the first one, which is the decorated target) may be invoked without parentheses:

>>> @decorator
... def plus2(func, value=1, missing=2):
...    from functools import wraps
...    @wraps(func)
...    def inner(*args):
...        print(missing)
...        return func(*args) + value
...    return inner

>>> @plus2
... def ident2(val):
...     return val

>>> ident2(10)
2
11

But (if you like) you may place the parentheses:

>>> @plus2()
... def ident3(val):
...     return val

>>> ident3(10)
2
11

However, this is not free: you cannot pass a single positional argument whose type is a function:

>>> def p():
...    print('This is p!!!')

>>> @plus2(p)   
... def dummy():
...    print('This is dummy')
Traceback (most recent call last):
    ...
TypeError: p() takes ...

The workaround for this case is to use a keyword argument.

xotl.tools.deprecation - Utils for marking deprecated elements

xotl.tools.deprecation.deprecated(replacement, msg=None, deprecated_module=None, removed_in_version=None, check_version=False)[source]

Small decorator for deprecated functions.

Usage:

@deprecated(new_function)
def deprecated_function(...):
    ...
Parameters:
  • replacement – Either a string or the object that replaces the deprecated.
  • msg

    A deprecation warning message template. You should provide keyword replacement fields for the format() function. Currently we pass these keyword arguments: replacement (after some processing), funcname with the name of the deprecated object, and in_version with the version in which this object is going to be removed if the removed_in_version argument is not None.

    Defaults to: “{funcname} is now deprecated and it will be removed{in_version}. Use {replacement} instead.”

  • removed_in_version – The version in which the deprecated object is going to be removed.
  • check_version

    If True and removed_in_version is not None, then declarations of obsolete objects will raise a DeprecationError. This helps the release manager keep the release clean.

    Note

    Currently only works with setuptools’ installed distributions.

  • deprecated_module – If provided, the name of the module the deprecated object resides in. Not needed if the deprecated object is a function or class.
  • new_name – If provided, it’s used as the name of the deprecated object. Needed to allow renaming in import_deprecated() helper function.

Note

Deprecating some classes in Python 3 could fail. This is because those classes do not declare a ‘__new__’ to pair with the declared ‘__init__’. The problem is solved if the ‘__new__’ of the super-class takes no arguments. This doesn’t happen in Python 2.

To solve these cases mark the deprecation in a comment and issue the warning directly in the constructor code.

Changed in version 1.4.1: Introduces removed_in_version and check_version.

xotl.tools.deprecation.deprecated_alias(f, **kwargs)[source]

Declare a deprecated alias.

This is roughly the same as deprecated(f)(f), which makes it convenient to give a better name to an already released function f, while keeping the old name as a deprecated alias.

New in version 2.1.0.
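
A usage sketch; the function names are hypothetical:

from xotl.tools.deprecation import deprecated_alias

def compute_total(items):
    return sum(items)

# Keep the old name working, but emit a deprecation warning when it is used.
calculate_total = deprecated_alias(compute_total)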

xotl.tools.deprecation.import_deprecated(module, *names, **aliases)[source]

Import functions deprecating them in the target module.

The target module is the caller of this function (only intended to be called in the global part of a module).

Parameters:
  • module – The module from which functions will be imported. Could be a string, or an imported module.
  • names – The names of the functions to import.
  • aliases – Keys are the new names, values the old names.

For example:

>>> from xotl.tools.deprecation import import_deprecated
>>> import math
>>> import_deprecated(math, 'sin', new_cos='cos')
>>> sin is not math.sin
True

The next examples are all True, but they print the deprecation warning when executed:

>>> sin(math.pi/2) == 1.0
>>> new_cos(2*math.pi) == math.cos(2*math.pi)

If no identifier is given, it is assumed to be equivalent to from module import *.

The statement import_deprecated('math', 'sin', new_cos='cos') has the same semantics as from math import sin, cos as new_cos, but deprecating current module symbols.

This function is provided to ease the deprecation of whole modules and should not be used for anything else.

xotl.tools.deprecation.deprecate_module(replacement, msg=None)[source]

Deprecate an entire module.

This function must be called in the global context of the deprecated module.

Parameters: replacement – The name of the replacement module.

For example:

>>> from xotl.tools.deprecation import deprecate_module
>>> deprecate_module('xotl.tools.symbols')
>>> del deprecate_module
xotl.tools.deprecation.deprecate_linked(check=None, msg=None)[source]

Deprecate an entire module if used through a link.

This function must be called in the global context of the new module.

Parameters: check – A module name to check; it must be part of the actual module name. If not given, ‘xotl.tools.future’ is assumed.

For example:

>>> from xotl.tools.deprecation import deprecate_linked
>>> deprecate_linked()
>>> del deprecate_linked

xotl.tools.dim - Facilities to work with concrete numbers

The name ‘dim’ is short for dimension. We borrow it from the topic of dimensional analysis, even though the scope of this module is less ambitious.

This module is divided into two major parts: meta-definitions and applications.

xotl.tools.dim.meta – Meta-definitions for concrete numbers.

UNIT

This is the constant value 1. It’s given this name to emphasize that it’s the canonical unit for a dimension.

SCALAR

The signature of dimensionless quantities.

xotl.tools.dim.base - The base physical quantities

Aliases
class L

An alias of Length

class T

An alias of Time

class M

An alias of Mass

class I

An alias of ElectricCurrent

class O

An alias of Temperature. We can’t really use the Greek letter Theta (Θ).

class N

An alias of Substance

class J

An alias of Luminosity

Derived quantities
class Area

Defined as L**2.

metre_squared

The canonical unit.

class Volume

Defined as L**3.

metre_cubic

The canonical unit.

class Frequency

Defined as T**-1 (which is the same as 1/T).

unit_per_second

The canonical unit.

Aliases of the canonical unit:

Hz
class Force

Defined as L * M * T**-2.

metre_kilogram_per_second_squared

The canonical unit.

Aliases of the canonical unit:

N
Newton
class Presure

Defined as L**-1 * M * T**-2.

kilogram_per_metre_per_second_squared

Aliases of the canonical unit:

pascal
Pa
class Velocity

Defined as L * T**-1.

metre_per_second

The canonical unit.

class Acceleration

Defined as L * T**-2.

metre_per_second_squared

The canonical unit.

On the automatically created names for derived quantities

We automatically create the name of the canonical unit of quantities derived from others by simple rules:

  • A * B joins the canonical unit names together with an underscore ‘_’ in between. Let’s represent it as a_b, where a stands for the name of the canonical unit of A and b for the canonical unit of B.

    For the case, A * A the unit name is a_squared.

  • A/B gets the name a_per_b. 1/A gets the name unit_per_a

  • A**n; when n=1 this is the same as A; when n=2 this is the same as A * A; for other positive values of n, the canonical unit name is a_pow_n; for negative values of n is the same as 1/A**n; for n=0 this is the Scalar quantity.
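
A sketch of these rules using the aliases above, assuming the dimension classes support the arithmetic shown in the definitions of this section:

from xotl.tools.dim.base import L, M, T

velocity = L * T**-1    # same as Velocity; canonical unit: metre_per_second
area = L**2             # same as Area; canonical unit: metre_squared
force = L * M * T**-2   # same as Force; canonical unit: metre_kilogram_per_second_squared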

xotl.tools.dim.currencies – Concrete numbers for money

xotl.tools.fp – Functional Programming in Python

Advanced functional programming in Python.

Note

This module is in an EXPERIMENTAL state; we encourage you not to use it before it is declared stable.


Ideally, a function only takes inputs and produces outputs, and doesn’t have any internal state that affects the output produced for a given input (as in Haskell).

Contents

xotl.tools.fp.iterators – High-level functional tools for iterators

Functional tools for functions that return iterators (generators, etc.)

Warning

This module is experimental. It may be removed completely, moved or otherwise changed.

xotl.tools.fp.iterators.kleisli_compose(*fs) → Callable[[T], Iterable[T]][source]

The Kleisli composition operator (right-to-left version).

For two functions, kleisli_compose(g, f) returns:

lambda x: (z for y in f(x) for z in g(y))

In general, this is reduce(_compose, fs, lambda x: [x]), where _compose is the two-function lambda shown above.

Note

Despite the name (Kleisli), Python does not have a true Monad type-class. So this function works with functions taking a single argument and returning an iterator; it also works with iterables.

New in version 1.9.6.

Changed in version 1.9.7: Name changed to kleisli_compose.

Warning

You may want to use kleisli_compose_foldl() which matches the order semantics of the functional kleisli composition >=>.

xotl.tools.fp.iterators.kleisli_compose_foldl(*fs) → Callable[[T], Iterable[T]][source]

Same as kleisli_compose() but composes left-to-right.

Examples:

>>> s15 = lambda s: tuple(s + str(i) for i in range(1, 5))
>>> s68 = lambda s: tuple(s + str(i) for i in range(6, 8))
>>> # kleisli_compose produces "6" >>= 1, 2, 3, 4; and then "7" >>= 1, 2, 3, 4
>>> list(kleisli_compose(s15, s68)(""))
['61', '62', '63', '64', '71', '72', '73', '74']
>>> list(kleisli_compose_foldl(s15, s68)(""))
['16', '17', '26', '27', '36', '37', '46', '47']

If the operation is non-commutative (as with string concatenation), you end up with very different results.

>>> n15 = lambda s: tuple(s + i for i in range(1, 5))
>>> n68 = lambda s: tuple(s + i for i in range(6, 8))
>>> list(kleisli_compose(n15, n68)(0))
[7, 8, 9, 10, 8, 9, 10, 11]
>>> list(kleisli_compose_foldl(n15, n68)(0))
[7, 8, 8, 9, 9, 10, 10, 11]

If the operation is commutative you get the same set of results, but the order may be different.

xotl.tools.fp.iterators.iter_compose()

Deprecated since version 1.9.7: Alias of xotl.tools.fp.iterators.kleisli_compose()

xotl.tools.fp.option - Functional Programming Option Type

Functional Programming Option Type definition.

In programming and type theory, an option type, or maybe type, represents the encapsulation of an optional value; e.g., it is used in functions which may or may not return a meaningful value when they are applied.

It consists of either a constructor encapsulating the original value x (written Just x or Some x) or an empty constructor (called None or Nothing). Outside of functional programming, these are known as nullable types.

In our case option type will be the Maybe class (the equivalent of Option in Scala Programming Language), the wrapper for valid values will be the Just class (equivalent of Some in Scala); and the wrapper for invalid values will be the Wrong class.

Instead of None or Nothing, Wrong is used for two reasons: (1) the special Python value None already exists, and (2) Wrong also wraps incorrect values and can have several instances (not only a null value).

class xotl.tools.fp.option.Just(*args)[source]

A wrapper for valid results.

class xotl.tools.fp.option.Maybe(*args)[source]

Wrapper for optional values.

The Maybe type encapsulates an optional value. A value of type Maybe a either contains a value of type a (represented as Just a), or it is empty (represented as Nothing). Using Maybe is a good way to deal with errors or exceptional cases without resorting to drastic measures such as error. In this implementation we make a variation where a Wrong object represents a missing value (with the special value Nothing) or an improper value (including errors).

See descendant classes Just and Wrong for more information.

This implementation combines Maybe and Either Haskell data types. Maybe is a means of being explicit that you are not sure that a function will be successful when it is executed. Conventionally, the usage of Either for errors uses Right when the computation is successful, and Left for failing scenarios.

In this implementation, Just is used for equivalence with both Haskell Just and Right types; Wrong is used with the special value Nothing and to encapsulate errors or incorrect values (Haskell Left).

Haskell:

data Maybe a = Nothing | Just a

either :: (a -> c) -> (b -> c) -> Either a b -> c

Case analysis for the Either type. If the value is Left a, apply the first function to a; if it is Right b, apply the second function to b.

classmethod choose(*types)[source]

Decorator to force Maybe values, constraining them to the expected types.

For example, a function that returns a collection (tuple or list) if valid, or False if not, could be ambiguous for an empty collection if not decorated:

>>> @Just.choose(tuple, list)
... def check_range(values, min, max):
...     if isinstance(values, (tuple, list)):
...         return [v for v in values if min <= v <= max]
...     else:
...         return False

>>> check_range(list(range(10)), 7, 17)
[7, 8, 9]

>>> check_range(list(range(10)), 17, 27)
Just([])

>>> check_range(set(range(10)), 7, 17)
False
classmethod compel(value)[source]

Coerce to the corresponding logical Boolean value.

Just is logically true, and Wrong is false.

For example:

>>> Just.compel([1])
[1]

>>> Just.compel([])
Just([])

>>> Wrong.compel([1])
Wrong([1])

>>> Wrong.compel([])
[]
class xotl.tools.fp.option.Wrong(*args)[source]

A wrapper for invalid results.

xotl.tools.fp.option.false = Wrong(False)

A Wrong special singleton encapsulating the False value.

xotl.tools.fp.option.none = Wrong(None)

A Wrong special singleton encapsulating the None value.

xotl.tools.fp.option.take(value)[source]

Extract a value.

xotl.tools.fp.option.true = Just(True)

A Just special singleton encapsulating the True value.

Further Notes

It could be thought that this kind of concept is useless in Python because of the dynamic nature of the language, but there are always certain logic systems that need to wrap “correct” false values and “incorrect” true values.

Also, in functional programming, errors can be reasoned about in a new way: more as error values than as exception handling. The Maybe type expresses the possibility of failure through Wrong instances encapsulating errors.

When you receive a Wrong instance encapsulating an error and want to recover the exception-propagation style (instead of continuing in a pure functional style), use throw() to re-raise the exception instead of the Python raise statement.

See https://en.wikipedia.org/wiki/Monad_%28functional_programming%29#The_Maybe_monad

xotl.tools.fp.prove - Prove validity of values

Proving success or failure of a function call has two main patterns:

  1. Predicative: a function call returns one or more values indicating a failure; for example, the string method find returns -1 if the sub-string is not found. In general this pattern considers one set of values as logically true and another set as false.

    Example:

    index = s.find('x')
    if index >= 0:
        ...    # condition of success
    else:
        ...    # condition of failure
    
  2. Disruptive: a function call throws an exception on failure, breaking the normal flow of execution; for example, the string method index.

    Example:

    try:
        index = s.index('x')
    except ValueError:
        ...    # condition of failure
    else:
        ...    # condition of success
    

    The exception object contains the semantics of the “anomalous condition”. Exception handling can be used as a flow control structure for inter-layer processing of execution contexts, or as a termination condition.

Module content

Validity proofs for data values.

There are some basic helper functions:

  • predicative() wraps a function in a way that a logical false value is returned on failure. If an exception is raised, it is returned wrapped as a special false value. See the Maybe monad for more information.
  • vouch() wraps a function in a way that an exception is raised if an invalid value (logical false by default) is returned. This is useful to call functions that use “special” false values to signal a failure.
  • enfold() creates a decorator to convert a function to use either the predicative() or the vouch() protocol.

New in version 1.8.0.

xotl.tools.fp.prove.enfold(checker)[source]

Create a decorator to execute a function inside a safety wrapper.

Parameters:checker – Could be any function to enfold, but it’s intended mainly for predicative() or vouch() functions.

In the following example, the semantics of this function can be seen. The definition:

>>> @enfold(predicative)
... def test(x):
...     return 1 <= x <= 10

>>> test(5)
5

It is equivalent to:

>>> def test(x):
...     return 1 <= x <= 10

>>> predicative(test, 5)
5

On the other hand:

>>> @enfold(predicative)
... def test(x):
...     return 1 <= x <= 10

>>> test(15)
Wrong(15)
xotl.tools.fp.prove.predicative(function, *args, **kwds)[source]

Call a function in a safety wrapper, returning a false value on failure.

This converts any function into a predicate. A predicate can be thought as an operator or function that returns a value that is either true or false.

Predicates are sometimes used to indicate set membership: on certain occasions it is inconvenient or impossible to describe a set by listing all of its elements. Thus, a predicate P(x) will be true or false, depending on whether x belongs to a set.

If the argument function validates its arguments, a valid true value is returned. There are two special conditions: first, a value treated as false by Python conventions (for example, 0 or an empty string); and second, when an exception is raised; in both cases the predicate will return an instance of Maybe.

xotl.tools.fp.prove.vouch(function, *args, **kwds)[source]

Call a function in a safety wrapper raising an exception if it fails.

When the wrapped function fails, an exception is raised. A predicate fails when it returns a false value. To avoid treating legitimate false values of some types as failures, wrap those values with Just before returning them.
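
For illustration, the following sketch (not taken from the library; the find_user helper is hypothetical) shows how vouch() turns a “special false value” failure into an exception:

from xotl.tools.fp.prove import vouch

def find_user(users, name):
    # Hypothetical helper: returns None (a false value) when the user is absent.
    return users.get(name)

users = {'ada': 1815}

# The value is returned unchanged when it is logically true...
assert vouch(find_user, users, 'ada') == 1815

# ...and an exception is raised when the wrapped call returns a false value.
try:
    vouch(find_user, users, 'grace')
except Exception:
    print('no such user')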

xotl.tools.fp.tools – High-level pure function tools

Tools for working with functions in a more “pure” way.

class xotl.tools.fp.tools.compose(*funcs)[source]

Composition of several functions.

Functions are composed right to left. A composition of zero functions gives back the identity() function.

The following rules hold (they are all checked inside the all() call below):

>>> x = 15
>>> f, g, h = x.__add__, x.__mul__, x.__xor__
>>> all((compose() is identity,
...
...      # identity functions are optimized
...      compose(identity, f, identity) is f,
...
...      compose(f) is f,
...      compose(g, f)(x) == g(f(x)),
...      compose(h, g, f)(x) == h(g(f(x)))))
True

If any “intermediate” function returns an instance of:

  • pos_args: it’s expanded as variable positional arguments to the next function.
  • kw_args: it’s expanded as variable keyword arguments to the next function.
  • full_args: it’s expanded as variable positional and keyword arguments to the next function.

The expected usage of these is not to have your functions return those types directly, but to use them when composing functions that return tuples with functions that expect tuples.

xotl.tools.fp.tools.identity(arg)[source]

Returns its argument unaltered.

xotl.tools.fp.tools.fst(pair, strict=True)[source]

Return the first element of pair.

If strict is True, pair needs to unpack to exactly two values. If strict is False this is the same as pair[0].

Note

This is an idiomatic function intended for use in compositions or as the argument of higher-order functions. Don’t use it in your code as a replacement for x[0].

New in version 1.8.5.

xotl.tools.fp.tools.snd(pair, strict=True)[source]

Return the second element of pair.

If strict is True, pair needs to unpack to exactly two values. If strict is False this is the same as pair[1].

Note

This is an idiomatic function intended for use in compositions or as the argument of higher-order functions. Don’t use it in your code as a replacement for x[1].

New in version 1.8.5.
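
A minimal usage sketch following the descriptions above:

>>> from xotl.tools.fp.tools import fst, snd
>>> fst((1, 2))
1
>>> snd((1, 2))
2
>>> fst((1, 2, 3), strict=False)   # with strict=False this is just pair[0]
1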

xotl.tools.fp.tools.constant(value)[source]

Return a function that always returns a constant value.

class xotl.tools.fp.tools.pos_args[source]

Mark a variable number of positional arguments (see full_args).

class xotl.tools.fp.tools.kw_args[source]

Mark a variable number of keyword arguments (see full_args).

class xotl.tools.fp.tools.full_args[source]

Mark a variable number of arguments for composition.

Pair containing positional and keyword (args, kwds) arguments.

In standard functional composition, the result of a function is considered a single value to be used as the next function's argument. You can override this behaviour by returning an instance of pos_args, kw_args, or this class, in order to provide multiple arguments to the next call.

Since types are callable, you may use them directly in compose() instead of changing your functions to return an instance of one of these classes:

>>> def join_args(*args):
...     return ' -- '.join(str(arg) for arg in args)

>>> compose(join_args, pos_args, list, range)(2)
'0 -- 1'

# Without 'pos_args', it prints the list
>>> compose(join_args, list, range)(2)
'[0, 1]'

xotl.tools.fs – file system utilities

File system utilities.

This module contains file-system utilities that could have side-effects. For path-handling functions that have no side-effects look at xotl.tools.fs.path.

xotl.tools.fs.ensure_filename(filename, yields=False)[source]

Ensures the existence of a file with a given filename.

If the filename is taken and is not pointing to a file (or a link to a file) an OSError is raised. If exist_ok is False the filename must not be taken; an OSError is raised otherwise.

The function creates all directories if needed. See makedirs() for restrictions.

If yields is True, returns the file object. This way you may open a file for writing like this:

with ensure_filename('/tmp/good-name-87.txt', yields=True) as fh:
    fh.write('Do it!')

The file is open in mode ‘w+b’.

New in version 1.6.1: Added parameter yields.

xotl.tools.fs.imap(func, pattern)[source]

Yields func(file_0, stat_0), func(file_1, stat_1), … for each dir path. The pattern may contain:

  • Simple shell-style wild-cards à la fnmatch.
  • Regex if pattern starts with ‘(?’. Expressions must be valid, as in “(?:[^.].*)$” or “(?i).*.jpe?g$”. Remember to add the end mark ‘$’ if needed.
xotl.tools.fs.iter_dirs(top='.', pattern=None, regex_pattern=None, shell_pattern=None)[source]

Iterate directories recursively.

The params have an analogous meaning to those in iter_files(), and the same restrictions apply.

xotl.tools.fs.iter_files(top='.', pattern=None, regex_pattern=None, shell_pattern=None, followlinks=False, maxdepth=None)[source]

Iterate filenames recursively.

Parameters:
  • top – The top directory to recurse into.
  • pattern – A pattern of the files you want to get from the iterator. It should be a string. If it starts with “(?” it will be regarded as a regular expression, otherwise a shell pattern.
  • regex_pattern – An alternative to pattern. This will always be regarded as a regular expression.
  • shell_pattern – An alternative to pattern. This should be a shell pattern.
  • followlinks – The same meaning that in os.walk.
  • maxdepth – Only files above this level will be yielded. If None, no limit is placed.

Warning

It’s an error to pass more than one pattern argument.

Changed in version 1.2.1: Added parameters followlinks and maxdepth.
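
A minimal sketch based on the parameters above (the file names and directory layout are, of course, illustrative):

from xotl.tools.fs import iter_files

# Print every Python file up to two directory levels below the current one.
for name in iter_files('.', shell_pattern='*.py', maxdepth=2):
    print(name)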

xotl.tools.fs.listdir(path)[source]

Same as os.listdir but normalizes path and raises no error.

xotl.tools.fs.rmdirs(top='.', pattern=None, regex_pattern=None, shell_pattern=None, exclude=None, confirm=None)[source]

Removes all empty dirs at top.

Parameters:
  • top – The top directory to recurse into.
  • pattern – A pattern of the dirs you want to remove. It should be a string. If it starts with “(?” it will be regarded as a regular expression, otherwise a shell pattern.
  • exclude – A pattern of the dirs you DON’T want to remove. It should be a string. If it starts with “(?” it will be regarded as a regular expression, otherwise a shell pattern. This is a simple convenience so you don’t have to negate complex patterns.
  • regex_pattern – An alternative to pattern. This will always be regarded as a regular expression.
  • shell_pattern – An alternative to pattern. This should be a shell pattern.
  • confirm – A callable that accepts a single argument, which is the path of the directory to be deleted. confirm should return True to allow the directory to be deleted. If confirm is None, then all matched dirs are deleted.

Note

In order to avoid common mistakes we won’t attempt to remove mount points.

New in version 1.1.3.
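
A small sketch of the parameters described above (the directory names are only illustrative):

from xotl.tools.fs import rmdirs

# Remove empty directories under 'build' that match 'tmp*', keep 'tmp-keep',
# and ask for confirmation before each removal.
rmdirs(
    'build',
    shell_pattern='tmp*',
    exclude='tmp-keep',
    confirm=lambda path: input('remove %s? ' % path) == 'y',
)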

xotl.tools.fs.stat(path)[source]

Return file or file system status.

This is the same as the function os.stat but raises no error.

xotl.tools.fs.walk_up(start, sentinel)[source]

Given a start directory, walk up the file system tree until either the FS root is reached or the sentinel is found.

The sentinel must be a string containing the file name to be found.

Warning

If sentinel is an absolute path that exists, this will return start no matter what start is (on Windows these could even be different drives).

If start path exists but is not a directory an OSError is raised.

xotl.tools.fs.concatfiles(*files, target)[source]

Concatenate several files into a single one.

Each positional argument must be either a file-like object or the name of a file.

The last positional argument is the target. If it’s a file-like object it must be open for writing, and the caller is responsible for closing it.

Alternatively if there are only two positional arguments and the first is a collection, the sources will be the members of the first argument.
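
A minimal usage sketch (the file names are illustrative):

from xotl.tools.fs import concatfiles

# Concatenate two source files into a third one; `target` is keyword-only.
concatfiles('part1.txt', 'part2.txt', target='combined.txt')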

xotl.tools.fs.makedirs(path, mode=0o777, exist_ok=False)[source]

Recursive directory creation function. Like os.mkdir(), but makes all intermediate-level directories needed to contain the leaf directory.

The default mode is 0o777 (octal). On some systems, mode is ignored. Where it is used, the current umask value is first masked out.

If exist_ok is False (the default), an OSError is raised if the target directory already exists.

Note

makedirs() will become confused if the path elements to create include os.pardir (eg. “..” on UNIX systems).

This function handles UNC paths correctly.

Changed in version 1.6.1: Behaves as Python 3.4.1.

Before Python 3.4.1 (ie. xotl.tools 1.6.1), if exist_ok was True and the directory existed, makedirs() would still raise an error if mode did not match the mode of the existing directory. Since this behavior was impossible to implement safely, it was removed in Python 3.4.1. See the original os.makedirs().
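
For example (a simple sketch; the path is illustrative):

from xotl.tools.fs import makedirs

# Create the whole directory chain; don't fail if it already exists.
makedirs('/tmp/project/data/raw', exist_ok=True)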

Contents:

xotl.tools.fs.path – Path utilities

Extensions to os.path

Functions inside this module must not have side-effects on the file-system. This module re-exports (without change) several functions from the os.path standard module.

xotl.tools.fs.path.join(base, *extras)[source]

Join two or more pathname components, inserting ‘/’ as needed.

If any component is an absolute path, all previous path components will be discarded.

The result is normalized after joining the parts, eliminating double slashes, etc.
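
A quick sketch of the expected behaviour (assuming the normalization mirrors os.path.normpath):

>>> from xotl.tools.fs.path import join
>>> join('/tmp', 'a//b', '..', 'c')
'/tmp/a/c'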

xotl.tools.fs.path.fix_encoding(name, encoding=None)[source]

Fix encoding of a file system resource name.

encoding is ignored if name is already a str.

xotl.tools.fs.path.normalize_path(base, *extras)[source]

Normalize path by:

  • expanding ‘~’ and ‘~user’ constructions.
  • eliminating double slashes
  • converting to absolute.
xotl.tools.fs.path.shorten_module_filename(filename)[source]

A filename, normally a module or package file, is shortened by looking for its head in every entry of the Python path.

xotl.tools.fs.path.shorten_user(filename)[source]

A filename is shortened by looking for the (expanded) $HOME at its head and replacing it with ‘~’.

xotl.tools.fs.path.rtrim(path, n=1)[source]

Trims the last n components of the pathname path.

This basically applies n times the function os.path.dirname to path.

path is normalized before proceeding (but not tested for existence).

Changed in version 1.5.5: n defaults to 1. In this case rtrim is identical to os.path.dirname().

Example:

>>> rtrim('/tmp/a/b/c/d', 3)
'/tmp/a'

# It does not matter if `/` is at the end
>>> rtrim('/tmp/a/b/c/d/', 3)
'/tmp/a'

xotl.tools.future - Extend standard modules with “future” features

Extend standard modules including “future” features in current versions.

Python 3 introduced several concepts in standard modules. Sometimes these features were back-ported during the evolution of the 2.7.x versions. By using these sub-modules, the differences can be bridged transparently. For example, you can import xotl.tools.future.collections.UserDict in any version; it is equivalent to Python 3’s collections.UserDict, which does not exist in Python 2.

New in version 1.7.2.

Contents

xotl.tools.future.codecs - Codec registry, base classes and tools

This module extends the standard library’s codecs. You may use it as a drop-in replacement in many cases.

Avoid importing * from this module since it could be different in Python 2.7 and Python 3.3.

We added the following features.

xotl.tools.future.codecs.force_encoding(encoding=None)[source]

Validates an encoding value.

If encoding is None, use locale.getdefaultlocale(). If that is also None, return ‘UTF-8’.

New in version 1.2.0.

Changed in version 1.8.0: migrated to ‘future.codecs’

Changed in version 1.8.7: Stop using locale.getpreferredencoding() and improve documentation.

xotl.tools.future.codecs.safe_decode(s, encoding=None)[source]

Similar to bytes decode method returning unicode.

Decodes s using the given encoding, or determining one from the system.

The return type depends on the Python version: unicode in 2.x and str in 3.x.

New in version 1.1.3.

Changed in version 1.8.0: migrated to ‘future.codecs’

xotl.tools.future.codecs.safe_encode(u, encoding=None)[source]

Similar to unicode encode method returning bytes.

Encodes u using the given encoding, or determining one from the system.

The return type is always bytes (which in Python 2.x is the same as str).

New in version 1.1.3.

Changed in version 1.8.0: migrated to ‘future.codecs’
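
A short sketch of both helpers (assuming a UTF-8 capable environment):

>>> from xotl.tools.future.codecs import safe_decode, safe_encode
>>> safe_decode(b'caf\xc3\xa9', 'utf-8')
'café'
>>> safe_encode('café', 'utf-8')
b'caf\xc3\xa9'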

xotl.tools.future.collections - High-performance container datatypes

This module extends the standard library’s collections. You may use it as a drop-in replacement in many cases.

Avoid importing * from this module since this is different in Python 2.7 and Python 3.3. Notably importing abc is not available in Python 2.7.

We have backported several Python 3.3 features but not all.

class xotl.tools.future.collections.defaultdict[source]

A hack for collections.defaultdict that passes the key and a copy of self as a plain dict (to avoid infinite recursion) to the callable.

Examples:

>>> from xotl.tools.future.collections import defaultdict
>>> d = defaultdict(lambda key, d: 'a')
>>> d['abc']
'a'

Since the second parameter is actually a dict-copy, you may (naively) do the following:

>>> d = defaultdict(lambda k, d: d[k])
>>> d['abc']
Traceback (most recent call last):
    ...
KeyError: 'abc'

You may use this class as a drop-in replacement for collections.defaultdict:

>>> d = defaultdict(lambda: 1)
>>> d['abc']
1
class xotl.tools.future.collections.opendict[source]

A dictionary implementation that mirrors its keys as attributes.

For example:

>>> d = opendict(es='spanish')
>>> d.es
'spanish'

>>> d['es'] = 'espanol'
>>> d.es
'espanol'

Setting attributes not already included does not make them keys:

>>> d.en = 'English'
>>> set(d)
{'es'}
classmethod from_enum(enumclass)[source]

Creates an opendict from an enumeration class.

If enumclass lacks the __members__ dictionary, take the __dict__ of the class disregarding the keys that cannot be public identifiers. If enumclass has the __members__ attribute this is the same as opendict(enumclass.__members__).

Example:

>>> from xotl.tools.future.collections import opendict
>>> @opendict.from_enum
... class Foo:
...    x = 1
...    _y = 2

>>> type(Foo) is opendict
True

>>> dict(Foo)
{'x': 1}
class xotl.tools.future.collections.codedict[source]

A dictionary implementation that evaluates keys as Python expressions.

This is also an open dict (see OpenDictMixin for more info).

Example:

>>> cd = codedict(x=1, y=2, z=3.0)
>>> '{_[x + y]} is 3 --  {_[x + z]} is 4.0'.format(_=cd)
'3 is 3 --  4.0 is 4.0'

It supports the right shift (>>) operator as a format operand (using _ as the special name for the code dict):

>>> cd >> '{_[x + y]} is 3 --  {_[x + z]} is 4.0 -- {x} is 1'
'3 is 3 --  4.0 is 4.0 -- 1 is 1'

It also implements the left shift (<<) operator:

>>> '{_[x + y]} is 3 --  {_[x + z]} is 4.0 -- {x} is 1' << cd
'3 is 3 --  4.0 is 4.0 -- 1 is 1'

New in version 1.8.3.

class xotl.tools.future.collections.Counter(**kwds)[source]

Dict subclass for counting hashable items. Sometimes called a bag or multiset. Elements are stored as dictionary keys and their counts are stored as dictionary values.

>>> c = Counter('abcdeabcdabcaba')  # count elements from a string
>>> c.most_common(3)                # three most common elements
[('a', 5), ('b', 4), ('c', 3)]
>>> sorted(c)                       # list all unique elements
['a', 'b', 'c', 'd', 'e']
>>> ''.join(sorted(c.elements()))   # list elements with repetitions
'aaaaabbbbcccdde'
>>> sum(c.values())                 # total of all counts
15
>>> c['a']                          # count of letter 'a'
5
>>> for elem in 'shazam':           # update counts from an iterable
...     c[elem] += 1                # by adding 1 to each element's count
>>> c['a']                          # now there are seven 'a'
7
>>> del c['b']                      # remove all 'b'
>>> c['b']                          # now there are zero 'b'
0
>>> d = Counter('simsalabim')       # make another counter
>>> c.update(d)                     # add in the second counter
>>> c['a']                          # now there are nine 'a'
9
>>> c.clear()                       # empty the counter
>>> c
Counter()

Note: If a count is set to zero or reduced to zero, it will remain in the counter until the entry is deleted or the counter is cleared:

>>> c = Counter('aaabbc')
>>> c['b'] -= 2                     # reduce the count of 'b' by two
>>> c.most_common()                 # 'b' is still in, but its count is zero
[('a', 3), ('c', 1), ('b', 0)]

Note

Backported from Python 3.3. In Python 3.3 this is an alias.

class xotl.tools.future.collections.OrderedDict[source]

Dictionary that remembers insertion order

Note

Backported from Python 3.3. In Python 3.3 this is an alias.

class xotl.tools.future.collections.OpenDictMixin[source]

A mixin for mappings implementation that expose keys as attributes.

For example:

>>> from xotl.tools.objects import SafeDataItem as safe
>>> class MyOpenDict(OpenDictMixin, dict):
...     __slots__ = safe.slot(OpenDictMixin.__cache_name__, dict)
>>> d = MyOpenDict({'es': 'spanish'})
>>> d.es
'spanish'
>>> d['es'] = 'espanol'
>>> d.es
'espanol'

When setting or deleting an attribute, the attribute name is regarded as a key in the mapping if neither of the following conditions holds:

  • The name is a slot.
  • The object has a __dict__ attribute and the name is key there.

This mixin defines the following features that can be redefined:

_key2identifier

Protected method; it receives a key as argument and must return a valid identifier that is used instead of the key as an extended attribute.

__cache_name__

Inner field to store a cached mapping between actual keys and calculated attribute names. The field must always be implemented as a SafeDataItem descriptor and must be of type dict. There are two ways of implementing this:

  • As a slot. The first example of this documentation uses this implementation. Don’t forget to pass the constructor dict as the second parameter.

  • As a normal descriptor:

    >>> from xotl.tools.objects import SafeDataItem as safe
    >>> class MyOpenDict(OpenDictMixin, dict):
    ...     safe(OpenDictMixin.__cache_name__, dict)
    

Classes or mixins that are meant to be integrated with dict by inheritance must not have a __slots__ definition. Because of that, this mixin must not declare any slot. If needed, it must be declared explicitly in customized classes, like in the example in the first part of this documentation or in the definition of the opendict class.

class xotl.tools.future.collections.OrderedSmartDict(**kwds)[source]

A combination of the OrderedDict with the SmartDictMixin.

Warning

Initializing with kwargs does not ensure any initial ordering, since Python’s keyword dict is not ordered. Use a list/tuple of pairs instead.

class xotl.tools.future.collections.SmartDictMixin[source]

A mixin that extends the update method of dictionaries

The standard method allows only one positional argument; this one allows several.

Note on using mixins in Python: the method resolution order follows the order of inheritance, so if a mixin is meant to override behaviour that already exists, list it first among the base classes. See SmartDict below.

class xotl.tools.future.collections.StackedDict(**kwargs)[source]

A multi-level mapping.

A level is entered by calling push() and left by calling pop().

The property level returns the actual number of levels.

When accessing keys they are searched from the latest level “upwards”; if such a key does not exist in any level a KeyError is raised.

Deleting a key only works in the current level; if it’s not defined there a KeyError is raised. This means that you can’t delete keys from the upper levels without popping.

Setting the value for key, sets it in the current level.

Changed in version 1.5.2: Based on the newly introduced ChainMap.

pop()[source]

A deprecated alias for pop_level().

Deprecated since version 1.7.0.

push(*args, **kwargs)[source]

A deprecated alias for push_level().

Deprecated since version 1.7.0.

level

Return the current level number.

The first level is 0. Calling push() increases the current level (and returns it), while calling pop() decreases the current level (if possible).

peek()[source]

Peeks the top level of the stack.

Returns a copy of the top-most level without any of the keys from lower levels.

Example:

>>> sdict = StackedDict(a=1, b=2)
>>> sdict.push(c=3)  # it returns the level...
1
>>> sdict.peek()
{'c': 3}
pop_level()[source]

Pops the last pushed level and returns the whole level.

If there are no levels in the stacked dict, a TypeError is raised.

Returns:A dict containing the popped level.
push_level(*args, **kwargs)[source]

Pushes a whole new level to the stacked dict.

Parameters:
  • args – Several mappings from which the new level will be initially filled.
  • kwargs – Values to fill the new level.
Returns:

The pushed level number.
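
A small sketch of pushing and popping levels, based on the descriptions above:

from xotl.tools.future.collections import StackedDict

sd = StackedDict(a=1)
sd.push_level(b=2)                      # returns the new level number (1)
assert sd['b'] == 2 and sd['a'] == 1    # keys are searched upwards
level = sd.pop_level()                  # returns the popped level
assert 'b' not in sd and sd['a'] == 1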

class xotl.tools.future.collections.ChainMap(*maps)[source]

A ChainMap groups multiple dicts or other mappings together to create a single, updateable view. If no maps are specified, a single empty dictionary is provided so that a new chain always has at least one mapping.

The underlying mappings are stored in a list. That list is public and can be accessed or updated using the maps attribute. There is no other state.

Lookups search the underlying mappings successively until a key is found. In contrast, writes, updates, and deletions only operate on the first mapping.

A ChainMap incorporates the underlying mappings by reference. So, if one of the underlying mappings gets updated, those changes will be reflected in ChainMap.

All of the usual dictionary methods are supported. In addition, there is a maps attribute, a method for creating new subcontexts, and a property for accessing all but the first mapping:

maps

A user updateable list of mappings. The list is ordered from first-searched to last-searched. It is the only stored state and can be modified to change which mappings are searched. The list should always contain at least one mapping.

new_child(m=None)[source]

Returns a new ChainMap containing a new map followed by all of the maps in the current instance. If m is specified, it becomes the new map at the front of the list of mappings; if not specified, an empty dict is used, so that a call to d.new_child() is equivalent to: ChainMap({}, *d.maps). This method is used for creating subcontexts that can be updated without altering values in any of the parent mappings.

Changed in version 1.5.5: The optional m parameter was added.

parents

Property returning a new ChainMap containing all of the maps in the current instance except the first one. This is useful for skipping the first map in the search. Use cases are similar to those for the nonlocal keyword used in nested scopes. A reference to d.parents is equivalent to: ChainMap(*d.maps[1:]).

Note

Backported from Python 3.4. In Python 3.4 this is an alias.
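
Since the behaviour matches the standard ChainMap, a quick example:

>>> from xotl.tools.future.collections import ChainMap
>>> defaults = {'user': 'guest', 'color': 'red'}
>>> session = {'user': 'ada'}
>>> cm = ChainMap(session, defaults)
>>> cm['user'], cm['color']
('ada', 'red')
>>> cm.parents['user']
'guest'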

class xotl.tools.future.collections.PascalSet(*others)[source]

Collection of unique integer elements (implemented with intervals).

PascalSet(*others) -> new set object

New in version 1.7.1.

class xotl.tools.future.collections.BitPascalSet(*others)[source]

Collection of unique integer elements (implemented with bit-wise sets).

BitPascalSet(*others) -> new bit-set object

New in version 1.7.1.

xotl.tools.future.contextlib - Utilities for with-statement contexts

Utilities for with-statement contexts.

This module re-exports all symbols from the standard library, with the exception of the function nested.

New in version 1.9.5.

The main reason to use this module is to stop using the nested() function and use the ExitStack implementation.

xotl.tools.future.csv - CSV parsing and writing extensions

New in version 1.8.4.

This module extends the standard library’s csv. You may use it as a drop-in replacement in many cases.

Avoid importing * from this module since it could be different in Python 2.7 and Python 3.3.

We added the following features.

class xotl.tools.future.csv.unix_dialect[source]

Describe the usual properties of Unix-generated CSV files.

Added only in Python 2 for compatibility purposes.

xotl.tools.future.csv.parse(data, *dialect, **options)[source]

Parse data into a sheet.

This function has the exact parameters protocol as reader()

parse(data [, dialect='excel'] [, optional keyword options])
Parameters:
  • data – Can be any object that returns a line of input for each iteration, such as a file object or a list.
  • dialect – An optional parameter can be given which is used to define a set of parameters specific to a particular CSV dialect. It may be an instance of a subclass of the Dialect class or one of the strings returned by the list_dialects() function.

The other optional keyword arguments can be given to override individual formatting parameters in the current dialect.

When reading a value, csv for Python version 2 doesn’t accept unicode text, so the given data lines are forced to be str values before being processed by reader(). Each cell is converted to unicode text after being read.

Returns:The parsed matrix.

A short usage example:

>>> from xotl.tools.future import csv
>>> with open('test.csv', newline='') as data:
...     matrix = csv.parse(data)
...     for row in matrix:
...         print(', '.join(row))
Last name, First Name
van Rossum, Guido
Stallman, Richard
xotl.tools.future.datetime - Basic date and time types

This module extends the standard library’s datetime. You may use it as a drop-in replacement in many cases.

Avoid importing * from this module since it could be different in Python 2.7 and Python 3.3.

In Python versions before 3, date formatting fails for several dates; for example, date(1800, 1, 1).strftime("%Y"). So the classes date and datetime are redefined in that case.

This problem could be solved by redefining the strftime function in the time module, because it is used for all strftime methods; but (WTF), Python double checks the year (in each method and then again in time.strftime function).

xotl.tools.future.datetime.assure(obj)[source]

Make sure that a date or datetime instance is a safe version.

This is only a type checker alternative to standard library.

We added the following features.

xotl.tools.future.datetime.strfdelta(delta)[source]

Format a timedelta using a smart pretty algorithm.

Only two levels of values will be printed.

>>> def t(h, m):
...     return timedelta(hours=h, minutes=m)

>>> strfdelta(t(4, 56)) == '4h 56m'
True
xotl.tools.future.datetime.strftime(dt, fmt)[source]

Used as strftime method of date and datetime redefined classes.

It can also be used with standard instances.

xotl.tools.future.datetime.get_month_first(ref=None)[source]

Given a reference date, returns the first date of the same month. If ref is not given, then uses current date as the reference.

xotl.tools.future.datetime.get_month_last(ref=None)[source]

Given a reference date, returns the last date of the same month. If ref is not given, then uses current date as the reference.

xotl.tools.future.datetime.get_next_month(ref=None, lastday=False)[source]

Get the first or last day of the next month.

If lastday is False return the first date of the next month. Otherwise, return the last date.

The next month is computed with regards to a reference date. If ref is None, take the current date as the reference.

Examples:

>>> get_next_month(date(2017, 1, 23))
date(2017, 2, 1)
>>> get_next_month(date(2017, 1, 23), lastday=True)
date(2017, 2, 28)

New in version 1.7.3.

xotl.tools.future.datetime.is_full_month(start, end)[source]

Returns true if the arguments comprises a whole month.

class xotl.tools.future.datetime.flextime[source]
xotl.tools.future.datetime.daterange([start, ]stop[, step])[source]

Similar to the standard ‘range’ function, but for date objects.

Returns an iterator that yields each date in the range of [start, stop), not including the stop.

If start is given, it must be a date (or datetime) value; and in this case only stop may be an integer, meaning the number of days to look ahead (or back if stop is negative).

If only stop is given, start will be the first day of stop’s month.

step, if given, should be a non-zero integer meaning the number of days to jump from one date to the next. It defaults to 1. If it’s positive then stop should happen after start, otherwise no dates will be yielded. If it’s negative stop should be before start.

As with range, stop is never included in the yielded dates.
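
A short sketch (the output is written with the same date repr used elsewhere in these docs):

>>> from datetime import date
>>> from xotl.tools.future.datetime import daterange

>>> list(daterange(date(2017, 1, 28), 3))
[date(2017, 1, 28), date(2017, 1, 29), date(2017, 1, 30)]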

class xotl.tools.future.datetime.DateField(name, nullable=False)[source]

A simple descriptor for dates.

Ensures that assigned values must be parseable dates and parses them.

class xotl.tools.future.datetime.TimeSpan(start_date=None, end_date=None)[source]

A continuous span of time.

Time span objects are iterable. They yield exactly two times: first the start date, and then the end date:

>>> ts = TimeSpan('2017-08-01', '2017-09-01')
>>> tuple(ts)
(date(2017, 8, 1), date(2017, 9, 1))

Time span objects have two items:

>>> ts[0]
date(2017, 8, 1)

>>> ts[1]
date(2017, 9, 1)

>>> ts[:]
(date(2017, 8, 1), date(2017, 9, 1))

Two time spans are equal if their start_date and end_date are equal. When comparing a time span with a date, the date is coerced to a time span (from_date()).

Note

Comparing time spans with date time spans coerces the time span before comparing.

A time span with its start set to None is unbound to the past. A time span with its end set to None is unbound to the future. A time span that is both unbound to the past and the future contains all possible dates. A time span that is not unbound in any direction is bound.

A bound time span is valid if its start date comes before its end date. Unbound time spans are always valid.

Time spans can intersect, compared for containment of dates and by the subset/superset order operations (<=, >=). In this regard, they represent the set of dates between start and end, inclusively.

Warning

Time spans don’t implement the union or difference operations expected in sets because the difference/union of two spans is not necessarily continuous.
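
For instance, a sketch of the set-like behaviour described above:

>>> from xotl.tools.future.datetime import TimeSpan
>>> jan = TimeSpan('2017-01-01', '2017-01-31')
>>> q1 = TimeSpan('2017-01-01', '2017-03-31')

>>> jan <= q1          # every date of January is inside the first quarter
True

>>> (jan & q1) == jan  # the intersection is January itself
True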

classmethod from_date(date: datetime.date) → xotl.tools.future.datetime.TimeSpan[source]

Return a new time span that covers a single date.

past_unbound

True if the time span is not bound into the past.

future_unbound

True if the time span is not bound into the future.

unbound

True if the time span is unbound into the past or unbound into the future, or both.

bound

True if the time span is not unbound.

valid

A bound time span is valid if it starts before it ends.

Unbound time spans are always valid.

__le__(other)[source]

True if other is a superset.

issubset()

An alias for __le__().

__ge__(other)[source]

True if other is a subset.

issuperset()

An alias for __ge__().

covers()

An alias for __ge__().

isdisjoint(other)[source]
overlaps(other)[source]

Test if the time spans overlap.

__contains__(other)[source]

Test if date other is in the time span.

__and__(other)[source]

Get the time span that is the intersection with another time span.

If two time spans don’t overlap, return EmptyTimeSpan.

If other is not a TimeSpan we try to create one. If other is a date, we create the TimeSpan that starts and ends that very day. Other types are passed unchanged to the constructor.

When other is a DateTimeSpan, convert self to a date time span before doing the intersection.

__mul__()

An alias for __and__().

intersection(*others)[source]

Return self [& other1 & ...].

__lshift__(delta)[source]

Return the time span displaced to the past in delta.

Parameters:delta – The number of days to displace. It can be either an integer or a datetime.timedelta. The integer will be converted to timedelta(days=delta).

Note

Delta values that don’t amount to at least a day will be the same as 0.

New in version 1.8.2.

Warning

Python does have boundaries for the dates it can represent, so displacing a TimeSpan can raise OverflowError.

__rshift__(delta)[source]

Return the time span displaced to the future in delta.

Parameters:delta – The number of days to displace. It can be either an integer or a datetime.timedelta. The integer will be converted to timedelta(days=delta).

Note

Delta values that don’t amount to at least a day will be the same as 0.

New in version 1.8.2.

Warning

Python does have boundaries for the dates it can represent, so displacing a TimeSpan can raise OverflowError.

__len__()[source]

The amount of dates in the span.

Warning

If the time span is unbound this method returns NotImplemented. This will make python complain with a TypeError.

New in version 1.8.2.

diff(other)[source]

Return the two time spans which (combined) contain all the dates in self which are not in other.

Notice this method returns a tuple of exactly two items.

If other and self don’t overlap, return (self, EmptyTimeSpan).

If self <= other is True, return the tuple with the empty time span in both positions.

Otherwise self will have some dates which are not in other; there are possible three cases:

  1. other starts before or at self’s start date; return the empty time span and the time span containing the dates after other.end_date up to self.end_date
  2. other ends at or after self’s end date; return the dates from self.start_date up to the date before other.start_date and the empty time span.
  3. other is fully contained in self; return two non-empty time spans as in the previous cases.

New in version 1.9.7.

class xotl.tools.future.datetime.DateTimeSpan(start_datetime=None, end_datetime=None)[source]

A continuous span of time (with datetime at each boundary).

DateTimeSpan is a minor extension of TimeSpan, and is a subclass.

DateTimeSpan objects are iterable. They yield exactly two datetimes: first the start date, and then the end date:

>>> ts = DateTimeSpan('2017-08-01 11:00', '2017-09-01 23:00')
>>> tuple(ts)
(datetime(2017, 8, 1, 11, 0), datetime(2017, 9, 1, 23, 0))

The API of DateTimeSpan is just the natural transformation of the API of TimeSpan.

The start_date and end_date attributes are interlocked with the start_datetime and end_datetime. By changing start_date, you also change start_datetime with the same date at 00:00 without tzinfo. By setting start_datetime you also update start_date. By setting end_date you also update end_datetime with the same date at 23:59:59 without tzinfo.

New in version 1.9.7.

Warning

DateTimeSpan is provided on a provisional basis. Future releases can change its API or remove it completely.

classmethod from_datetime(dt)[source]

Return a new date time span that covers a single datetime.

If dt is actually a date, the start_datetime will be at ‘00:00:00’ and the end_datetime will be at ‘23:59:59’.

classmethod from_timespan(ts)[source]

Return a new date time span from a timespan.

Notice the start datetime will be set at ‘00:00:00’ and the end datetime at ‘23:59:59’.

If ts is already a DateTimeSpan, return it unchanged.

past_unbound

True if the time span is not bound into the past.

future_unbound

True if the time span is not bound into the future.

unbound

True if the time span is unbound into the past or unbound into the future, or both.

bound

True if the time span is not unbound.

valid

A bound time span is valid if it starts before it ends.

Unbound time spans are always valid.

__le__(other)[source]

True if other is a superset.

issubset()

An alias for __le__().

__ge__(other)[source]

True if other is a subset.

issuperset()

An alias for __ge__().

covers()

An alias for __ge__().

isdisjoint(other)[source]
overlaps(other)[source]

Test if the time spans overlap.

__contains__(other)[source]

Test if datetime other is in the datetime span.

If other is a date, we convert it to a naive datetime at midnight (00:00:00).

__and__(other)[source]

Get the date time span that is the intersection with another time span.

If two time spans don’t overlap, return the object EmptyTimeSpan.

If other is not a DateTimeSpan we try to create one. If other is a date/datetime, we use from_datetime(). If other is a TimeSpan we use from_timespan(). Other types are passed unchanged to the constructor.

__mul__()

An alias for __and__().

intersection(*others)[source]

Return self [& other1 & ...].

__lshift__(delta)[source]

Return the date time span displaced to the past in delta.

Parameters:delta – The number of days to displace. It can be either an integer or a datetime.timedelta. The integer will be converted to timedelta(days=delta).

Warning

Python does have boundaries for the dates it can represent, so displacing can raise OverflowError.

__rshift__(delta)[source]

Return the date time span displaced to the future in delta.

Parameters:delta – The number of days to displace. It can be either an integer or a datetime.timedelta. The integer will be converted to timedelta(days=delta).

Warning

Python does have boundaries for the dates it can represent, so displacing can raise OverflowError.

__len__()

The amount of dates in the span.

Warning

If the time span is unbound this method returns NotImplemented. This will make python complain with a TypeError.

New in version 1.8.2.

diff(other)[source]

Return the two datetime spans which (combined) contain all the seconds in self which are not in other.

Notice this method returns a tuple of exactly two items.

If other and self don’t overlap, return (self, EmptyTimeSpan).

If self <= other is True, return the tuple with the empty time span in both positions.

Otherwise self will have some datetimes which are not in other; there are possible three cases:

  1. other starts before or at self’s start datetime; return the empty time span and the datetime span from the second after other.end_datetime up to self.end_datetime
  2. other ends at or after self’s end date; return the datetime span from self.start_datetime up to the second before other.start_datetime and the empty time span.
  3. other is fully contained in self; return two non-empty datetime spans as in the previous cases.
xotl.tools.future.datetime.EmptyTimeSpan

The empty time span. It’s not an instance of TimeSpan but it supports the set-like operations: union, intersection, etc.

No date is a member of the empty time span. The empty time span is a proper subset of any time span. It’s only a superset of itself. It’s not a proper superset of any other time span nor itself.

This instance is a singleton.

xotl.tools.future.functools - Higher-order functions and callable objects

This module extends the standard library’s functools. You may use it as a drop-in replacement in many cases.

Avoid importing * from this module since it could be different in Python 2.7 and Python 3.3.

We added the following features.

xotl.tools.future.functools.power(*funcs, times)[source]

Returns the “power” composition of several functions.

Examples:

>>> import operator
>>> f = power(partial(operator.mul, 3), 3)
>>> f(23) == 3*(3*(3*23))
True

>>> power(operator.neg)
Traceback (most recent call last):
...
TypeError: power() takes at least 2 arguments (1 given)
class xotl.tools.future.functools.lwraps(f, n, *, name=None, doc=None, wrapped=None)[source]

Lambda wrapper.

Useful for decorating lambda functions with a name and documentation.

The function to be decorated and the name may be passed as positional arguments in any order. So the next two identity definitions are equivalent:

>>> from xotl.tools.future.functools import lwraps as lw

>>> identity = lw('identity', lambda arg: arg)

>>> identity = lw(lambda arg: arg, 'identity')

Some special values can be passed as keyword arguments, as well as any number of literal values to be assigned:

  • name: The name of the function (__name__); only valid if not given as positional argument.
  • doc: The documentation (__doc__ field).
  • wrapped: An object to extract all values not yet assigned. These values are (‘__module__’, ‘__name__’ and ‘__doc__’) to be assigned, and ‘__dict__’ to be updated.

If the function to decorate is present in the positional arguments, that same function is returned directly after being decorated; if not, a decorator is returned, similar to the standard wraps().

For example:

>>> from xotl.tools.future.functools import lwraps as lw

>>> is_valid_age = lw('is-valid-human-age', lambda age: 0 < age <= 120,
...                   doc=('A predicate to evaluate if an age is '
...                        'valid for a human being.'))

>>> @lw(wrapped=is_valid_age)
... def is_valid_working_age(age):
...     return 18 < age <= 70

>>> is_valid_age(16)
True

>>> is_valid_age(200)
False

>>> is_valid_working_age(16)
False

New in version 1.7.0.

xotl.tools.future.functools.curry(f)[source]

Return a function that automatically ‘curries’ its positional arguments.

Example:

>>> add = curry(lambda x, y: x + y)
>>> add(1)(2)
3

>>> add(1, 2)
3

>>> add()()()(1, 2)
3

We have backported several Python 3.3 features but maybe not all.

xotl.tools.future.functools.update_wrapper(wrapper, wrapped, assigned=WRAPPER_ASSIGNMENTS, updated=WRAPPER_UPDATES)[source]

Update a wrapper function to look like the wrapped function. The optional arguments are tuples to specify which attributes of the original function are assigned directly to the matching attributes on the wrapper function and which attributes of the wrapper function are updated with the corresponding attributes from the original function. The default values for these arguments are the module level constants WRAPPER_ASSIGNMENTS (which assigns to the wrapper function’s __name__, __module__, __annotations__ and __doc__, the documentation string) and WRAPPER_UPDATES (which updates the wrapper function’s __dict__, i.e. the instance dictionary).

To allow access to the original function for introspection and other purposes (e.g. bypassing a caching decorator such as lru_cache()), this function automatically adds a __wrapped__ attribute to the wrapper that refers to the original function.

The main intended use for this function is in decorator functions which wrap the decorated function and return the wrapper. If the wrapper function is not updated, the metadata of the returned function will reflect the wrapper definition rather than the original function definition, which is typically less than helpful.

update_wrapper() may be used with callables other than functions. Any attributes named in assigned or updated that are missing from the object being wrapped are ignored (i.e. this function will not attempt to set them on the wrapper function). AttributeError is still raised if the wrapper function itself is missing any attributes named in updated.

xotl.tools.future.inspect - Inspect live objects

This module extends the standard library’s inspect. You may use it as a drop-in replacement in many cases.

Avoid importing * from this module since it could be different in Python 2.7 and Python 3.3.

We added the following features.

xotl.tools.future.inspect.get_attr_value(obj, name, *default)[source]

Get a named attribute from an object in a safe way.

Similar to getattr but without triggering dynamic look-up via the descriptor protocol, __getattr__ or __getattribute__ by using getattr_static().

We have backported several Python 3.3 features but maybe not all (some protected structures are not presented in this documentation).

xotl.tools.future.inspect.getfullargspec(func)[source]

Get the names and default values of a callable object’s parameters.

A tuple of seven things is returned: (args, varargs, varkw, defaults, kwonlyargs, kwonlydefaults, annotations). ‘args’ is a list of the parameter names. ‘varargs’ and ‘varkw’ are the names of the * and ** parameters or None. ‘defaults’ is an n-tuple of the default values of the last n parameters. ‘kwonlyargs’ is a list of keyword-only parameter names. ‘kwonlydefaults’ is a dictionary mapping names from kwonlyargs to defaults. ‘annotations’ is a dictionary mapping parameter names to annotations.

Notable differences from inspect.signature():
  • the “self” parameter is always reported, even for bound methods
  • wrapper chains defined by __wrapped__ are not unwrapped automatically
xotl.tools.future.inspect.getattr_static(obj, attr, default=<object object>)[source]

Retrieve attributes without triggering dynamic lookup via the descriptor protocol, __getattr__ or __getattribute__.

Note: this function may not be able to retrieve all attributes that getattr can fetch (like dynamically created attributes) and may find attributes that getattr can’t (like descriptors that raise AttributeError). It can also return descriptor objects instead of instance members in some cases. See the documentation for details.
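
A short sketch, relying on the standard inspect.getattr_static() semantics:

>>> from xotl.tools.future.inspect import getattr_static

>>> class Lazy:
...     @property
...     def value(self):
...         raise RuntimeError('should not be evaluated')

>>> obj = Lazy()
>>> type(getattr_static(obj, 'value')) is property   # the raw descriptor itself
True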

xotl.tools.future.itertools - Functions creating iterators for efficient looping

xotl.tools.future.itertools.merge(*iterables, key=None)

Merge the iterables in order.

Return an iterator that yields all items from iterables following the order given by key. If key is not given we compare the items.

If the iterables yield their items in increasing order (w.r.t key), the result is also ordered (like a merge sort).

Without iterables, merge() returns the empty iterator.

New in version 1.8.4.

Changed in version 2.1.0: Based on heapq.merge(). In Python 3.5+, this is just an alias of it.

Deprecated since version 2.1.0: Use heapq.merge() directly. This function will be removed when we drop support for Python 3.4.
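
A quick example (the semantics follow heapq.merge()):

>>> from xotl.tools.future.itertools import merge
>>> list(merge([1, 4, 7], [2, 5, 8], [3, 6, 9]))
[1, 2, 3, 4, 5, 6, 7, 8, 9]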

xotl.tools.iterators.zip([iter1[, iter2[, ...]]])

Return a zip-like object whose next() method returns a tuple where the i-th element comes from the i-th iterable argument. The next() method continues until the shortest iterable in the argument sequence is exhausted and then it raises StopIteration.

Deprecated since version 2.1.0: Use the builtin zip(). This function will be removed in xotl.tools 3.

xotl.tools.iterators.map(func, *iterables)

Make an iterator that computes the function using arguments from each of the iterables. It stops when the shortest iterable is exhausted instead of filling in None for shorter iterables.

Deprecated since version 2.1.0: Use the builtin map(). This function will be removed in xotl.tools 3.

xotl.tools.iterators.zip_longest(*iterables, fillvalue=None)

Make an iterator that aggregates elements from each of the iterables. If the iterables are of uneven length, missing values are filled-in with fillvalue. Iteration continues until the longest iterable is exhausted.

If one of the iterables is potentially infinite, then the zip_longest() function should be wrapped with something that limits the number of calls (for example islice() or takewhile()). If not specified, fillvalue defaults to None.

This function is actually an alias to itertools.izip_longest() in Python 2.7, and an alias to itertools.zip_longest() in Python 3.3.

xotl.tools.future.json - Encode and decode the JSON format

This module extends the standard library’s json. You may use it as a drop-in replacement in many cases.

Avoid importing * from this module since it could be different in Python 2.7 and Python 3.3.

We added the following features.

class xotl.tools.future.json.JSONEncoder(*, skipkeys=False, ensure_ascii=True, check_circular=True, allow_nan=True, sort_keys=False, indent=None, separators=None, default=None)[source]

Extensible JSON <http://json.org> encoder for Python data structures.

Supports the following objects and types by default:

Python         JSON
------------   -------
dict           object
list, tuple    array
str            string
int, float     number
True           true
False          false
None           null

To extend this to recognize other objects, subclass and implement a .default() method that returns a serializable object for o if possible; otherwise it should call the superclass implementation (to raise TypeError).

xotl.tools extends this class by supporting the following data-types (see default() method):

  • datetime, date and time values, which are translated to strings using ISO format.
  • Decimal values, which are represented as strings.
  • Iterables, which are represented as lists.
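
For instance, a sketch of encoding with those extensions (the exact output string assumes keys keep insertion order):

>>> import json
>>> from datetime import date
>>> from decimal import Decimal
>>> from xotl.tools.future.json import JSONEncoder

>>> json.dumps({'when': date(2017, 8, 1), 'amount': Decimal('10.5')},
...            cls=JSONEncoder)
'{"when": "2017-08-01", "amount": "10.5"}'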
xotl.tools.future.json.encode_string(string, ensure_ascii=True)[source]

Return a JSON representation of a Python string.

Parameters:ensure_ascii – If True, the output is guaranteed to be of type str with all incoming non-ASCII characters escaped. If False, the output can contain non-ASCII characters.
xotl.tools.future.mimetypes – Map filenames to MIME types

Extensions to the standard library module mimetypes.

This module re-exports all functions of the current Python version’s mimetypes.

New in version 1.8.4.

xotl.tools.future.mimetypes.guess_type(url, strict=True, default=(None, None))[source]

Guess the type of a file based on its filename or URL, given by url.

This is the same as mimetypes.guess_type() with the addition of the default keyword argument.
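
A small sketch (assuming default is returned when the type can’t be guessed):

>>> from xotl.tools.future.mimetypes import guess_type
>>> guess_type('photo.jpg')
('image/jpeg', None)
>>> guess_type('unknown.blob', default=('application/octet-stream', None))
('application/octet-stream', None)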

xotl.tools.future.pprint - Extension to the data pretty printer

This modules includes all the Python’s standard library features in module pprint and adds the function ppformat(), which just returns a string of the pretty-formatted object.

New in version 1.4.1.

xotl.tools.future.pprint.ppformat(obj)[source]

Just like pprint() but always returning a result.

Returns:The pretty formatted text.
Return type:unicode in Python 2, str in Python 3.
xotl.tools.future.subprocess - Extensions to the subprocess standard module

New in version 1.2.1.

This module contains extensions to the subprocess standard library module. It may be used as a replacement of the standard.

xotl.tools.future.subprocess.call_and_check_output(args, *, stdin=None, shell=False)[source]

This function combines the result of both call and check_output (from the standard library module).

Returns a tuple (retcode, output, err_output).
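
A minimal sketch (the command is illustrative; the output streams are captured as with the standard check_output):

from xotl.tools.future.subprocess import call_and_check_output

# retcode is the exit status; output/err_output are the captured streams.
retcode, output, err_output = call_and_check_output(['echo', 'hello'])
assert retcode == 0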

xotl.tools.future.textwrap - Text wrapping and filling

This module extends the standard library’s textwrap. You may use it as a drop-in replacement in many cases.

Avoid importing * from this module since it could be different in Python 2.7 and Python 3.3.

We added the following features.

xotl.tools.future.textwrap.dedent(text, skip_firstline=False)[source]

Remove any common leading whitespace from every line in text.

This can be used to make triple-quoted strings line up with the left edge of the display, while still presenting them in the source code in indented form.

Note that tabs and spaces are both treated as whitespace, but they are not equal: the lines "    hello" and "\thello" are considered to have no common leading whitespace.

If skip_firstline is True, the first line is separated from the rest of the body. This helps with docstrings that follow PEP 257.

Warning

The skip_firstline argument is not available in the standard library’s dedent.

We have backported several Python 3.3 features but maybe not all.

xotl.tools.future.textwrap.indent(text, prefix, predicate=None)[source]

Adds ‘prefix’ to the beginning of selected lines in ‘text’.

If ‘predicate’ is provided, ‘prefix’ will only be added to the lines where ‘predicate(line)’ is True. If ‘predicate’ is not provided, it will default to adding ‘prefix’ to all non-empty lines that do not consist solely of whitespace characters.

xotl.tools.future.threading - Higher-level threading interface

This module extends the standard library’s threading. You may use it as a drop-in replacement in many cases.

Avoid importing * from this module since it could be different in Python 2.7 and Python 3.3.

We added the following features.

xotl.tools.future.threading.async_call(func, args=None, kwargs=None, callback=None, onerror=None)[source]

Executes a function asynchronously.

The function receives the given positional and keyword arguments.

If callback is provided, it is called with a single positional argument: the result of calling func(*args, **kwargs).

If the called function ends with an exception and onerror is provided, it is called with the exception object.

Returns:An event object that gets signalled when the function ends its execution whether normally or with an error.
Return type:Event
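
A short sketch of the asynchronous call protocol described above:

from xotl.tools.future.threading import async_call

def work(x, y):
    return x + y

# Run `work(1, 2)` in another thread; print the result (or the error) when done.
event = async_call(work, args=(1, 2), callback=print, onerror=print)
event.wait()   # the returned event is signalled when `work` ends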
xotl.tools.future.threading.sync_call(funcs, callback, timeout=None)[source]

Calls several functions, each one in its own thread.

Waits for all to end.

Each time a function ends the callback is called (wrapped in a lock to avoid race conditions) with the result of the call as a single positional argument.

If timeout is not None it should be a float indicating the seconds to wait before aborting. Functions that terminate before the timeout will have called callback, but those that are still running will be ignored.

Todo

Abort the execution of a thread.

Parameters:funcs – A sequences of callables that receive no arguments.
xotl.tools.future.types - Names for built-in types and extensions

This module extends the standard library’s types. You may use it as a drop-in replacement in many cases.

Avoid importing * from this module since it could be different in Python 2.7 and Python 3.3.

We mainly added compatibility type definitions: those that may exist in one Python version but not in another.

class xotl.tools.future.types.MappingProxyType

New in version 1.5.5.

Read-only proxy of a mapping. It provides a dynamic view on the mapping’s entries, which means that when the mapping changes, the view reflects these changes.

Note

In Python 3.3+ this is an alias for types.MappingProxyType in the standard library.

class xotl.tools.future.types.SimpleNamespace

New in version 1.5.5.

A simple object subclass that provides attribute access to its namespace, as well as a meaningful repr.

Unlike object, with SimpleNamespace you can add and remove attributes. If a SimpleNamespace object is initialized with keyword arguments, those are directly added to the underlying namespace.

The type is roughly equivalent to the following code:

class SimpleNamespace(object):
    def __init__(self, **kwargs):
        self.__dict__.update(kwargs)
    def __repr__(self):
        keys = sorted(self.__dict__)
        items = ("{}={!r}".format(k, self.__dict__[k]) for k in keys)
        return "{}({})".format(type(self).__name__, ", ".join(items))
    def __eq__(self, other):
        return self.__dict__ == other.__dict__

SimpleNamespace may be useful as a replacement for class NS: pass. However, for a structured record type use namedtuple() instead.

Note

In Python 3.4+ this is an alias to types.SimpleNamespace.

xotl.tools.infinity - An infinite value

xotl.tools.infinity.Infinity

The positive infinite value. The negative infinite value is -Infinity.

These values are only sensible for comparison. Arithmetic is not supported.

The type of values that is comparable with Infinity is controlled by the ABC InfinityComparable.

class xotl.tools.infinity.InfinityComparable[source]

Any type that can be sensibly compared to infinity.

All types in the number tower are always comparable.

Classes datetime.date, datetime.datetime, and datetime.timedelta are automatically registered.

xotl.tools.keywords – Tools to manage Python keywords as names

Tools to manage Python keywords as names.

Reserved Python keywords can’t be used as attribute names, so the functions in this module follow the convention of renaming such a name by appending an underscore suffix whenever a reserved keyword is used as a name.

xotl.tools.keywords.delkwd(obj, name)[source]

Like delattr but taking into account Python keywords.

xotl.tools.keywords.getkwd(obj, name, default=None)[source]

Like getattr but taking into account Python keywords.

xotl.tools.keywords.kwd_deleter(obj)[source]

partial(delkwd, obj)

xotl.tools.keywords.kwd_getter(obj)[source]

partial(getkwd, obj)

xotl.tools.keywords.kwd_setter(obj)[source]

partial(setkwd, obj)

xotl.tools.keywords.org_kwd(name)[source]

Remove the underscore suffix if name starts with a Python keyword.

xotl.tools.keywords.setkwd(obj, name, value)[source]

Like setattr but taking into account Python keywords.

xotl.tools.keywords.suffix_kwd(name)[source]

Add an underscore suffix if name is a Python keyword.
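
A short sketch of the naming convention in practice (the Box class is just a plain container for the example):

>>> from xotl.tools.keywords import suffix_kwd, org_kwd, setkwd, getkwd

>>> suffix_kwd('class')
'class_'
>>> org_kwd('class_')
'class'

>>> class Box: pass
>>> box = Box()
>>> setkwd(box, 'class', 'first')
>>> box.class_
'first'
>>> getkwd(box, 'class')
'first'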

xotl.tools.modules – Utilities for working with modules

Modules utilities.

xotl.tools.modules.copy_members(source=None, target=None)[source]

Copy module members from source to target.

It’s common in xotl.tools package to extend Python modules with the same name, for example xotl.tools.datetime has all public members of Python’s datetime. copy_members() can be used to copy all members from the original module to the extended one.

Parameters:
  • source

    string with source module name or module itself.

    If not given, it is assumed to be the last part of target’s module name.

  • target

    string with target module name or module itself.

    If not given, the target name is looked up in the stack of the caller module.

Returns:

Source module.

Return type:

ModuleType

Warning

Implementation detail

Function used to inspect the stack is not guaranteed to exist in all implementations of Python.

xotl.tools.modules.customize(module, custom_attrs=None, meta=None)[source]

Replaces a module by a custom one.

Injects all the custom attributes into the newly created module’s class. This allows modules to have properties or other types of descriptors.

Parameters:
  • module – The module object to customize.
  • custom_attrs

    A dictionary of custom attributes that should be injected in the customized module.

    New in version 1.4.2: Changes the API, no longer uses the **kwargs idiom for custom attributes.

  • meta – The metaclass of the module type. This should be a subclass of type. We will actually subclass this metaclass to properly inject custom_attrs in our own internal metaclass.
Returns:

A tuple (module, customized, class): the module comes first; customized is True only if the module was actually created (i.e. customize() is idempotent); and the third item is the class of the module (the first item).

xotl.tools.modules.force_module(ref=None)[source]

Load a module from a string or return module if already created.

If ref is not specified (or is an integer), the calling module is assumed by looking in the stack.

Note

Implementation detail

Function used to inspect the stack is not guaranteed to exist in all implementations of Python.

xotl.tools.modules.get_module_path(module)[source]

Gets the absolute path of a module.

Parameters:module – Either module object or a (dotted) string for the module.
Returns:The path of the module.

If the module is a package, returns the directory path (not the path to the __init__).

If module is a string and it’s not absolute, raises a TypeError.

xotl.tools.modules.modulemethod(func)[source]

Decorator that defines a module-level method.

A module-level method always receives the module object as its first argument self.

xotl.tools.modules.moduleproperty(getter, setter=None, deleter=None, doc=None, base=<class 'property'>)[source]

Decorator that creates a module-level property.

The module of the getter is replaced by a custom implementation of the module, and the property is injected to the custom module’s class.

The parameter base serves the purpose of changing the base for the property. For instance, this allows you to have memoized_properties at the module-level:

def memoized(self):
    return self
memoized = moduleproperty(memoized, base=memoized_property)

New in version 1.6.1: Added the base parameter.
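A sketch of the decorator usage (the module name mymod and the property name version are illustrative assumptions, not part of the API):

# mymod.py
from xotl.tools.modules import moduleproperty

@moduleproperty
def version(self):
    # self is the module object (mymod) itself
    return getattr(self, '_version', '1.0')

# Client code would then read the property from the module object:
#
#   import mymod
#   mymod.version    # -> '1.0'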

xotl.tools.names – Utilities for handling objects names

A protocol to obtain or manage object names.

xotl.tools.names.nameof(*objects, depth=1, inner=False, typed=False, full=False, safe=False)[source]

Obtain the name of each one of a set of objects.

New in version 1.4.0.

    Changed in version 1.6.0:
  • Keyword arguments are now keyword-only arguments.

  • Support for several objects

  • Improved the semantics of parameter full.

  • Added the safe keyword argument.

If no object is given, None is returned; if only one object is given, a single string is returned; otherwise a list of strings is returned.

The name of an object is normally the variable name in the calling stack.

If the object is not present in the calling frame, up to five frame levels are searched. Use the depth keyword argument to specify a different starting point and the search will proceed five levels from this frame up.

If the same object has several good names a single one is arbitrarily chosen.

Good names candidates are retrieved based on the keywords arguments full, inner, safe and typed.

If typed is True and the object is not a type name or a callable (see xotl.tools.future.inspect.safe_name()), then the type of the object is used instead.

If inner is True we try to extract the name by introspection instead of looking for the object in the frame stack.

If full is True the full identifier of the object is preferred. In this case if inner is False the local-name for the object is found. If inner is True, find the import-name.

If safe is True, the returned value is converted (if it is not already) into a valid Python identifier, though you should not trust that this identifier resolves to the value.

See the examples in the documentation.

Warning

The names of objects imported from ‘xoutil’ are still in the namespace ‘xotl.tools’.

xotl.tools.names.identifier_from(obj)[source]

Build a valid identifier from the name extracted from an object.

New in version 1.5.6.

First, if the argument is a type, return the name of the type prefixed with _ if that is valid; otherwise call the nameof() function repeatedly until a valid identifier is found, using the following order: inner=True, then without arguments (looking up a variable in the calling stack), and finally typed=True. Return None if no valid value is found.

Examples:

>>> identifier_from({})
'dict'

Use cases for getting the name of an object

The function nameof() is useful for cases when you get a value and you need a name. This is a common need when doing framework-level code that tries to avoid repetition of concepts.

Solutions with nameof()
Properly calculate the tasks’ name in Celery applications

Celery warns about how to import the tasks. If in a module you import your task using an absolute import, and in another module you import it using a relative import, Celery regards them as different tasks. You must either use a consistent import style, or give a name for the task. Using nameof you can easily fix this problem.

Assume you create a celapp.tasks.basic module with this code:

>>> def celery_task(celeryapp, *args, **kwargs):
...    def decorator(func):
...        from xotl.tools.names import nameof
...        taskname = nameof(func, full=True, inner=True)
...        return celeryapp.task(name=taskname, *args, **kwargs)(func)
...    return decorator

>>> from celery import Celery
>>> app = Celery()
>>> @celery_task(app)
... def add(x, y):
...     return x + y

Then importing the task directly in a shell will have the correct name:

>>> from celapp.tasks.basic import add
>>> add.name
'celapp.tasks.basic.add'

Another module that imports the task will also see the proper name. Say you have the module celapp.consumer:

>>> from .tasks import basic

>>> def get_name(taskname):
...     task = getattr(basic, taskname)
...     return task.name

Then:

>>> from celapp.consumer import get_name
>>> get_name('add')
'celapp.tasks.basic.add'

Even though you imported the basic module with a relative import, the name is still fully qualified.

xotl.tools.objects - Functions for dealing with objects

Several utilities for objects in general.

xotl.tools.objects.validate_attrs(source, target, force_equals=(), force_differents=())[source]

Makes a ‘comparison’ of source and target by its attributes (or keys).

This function returns True if and only if both of these tests pass:

  • All attributes in force_equals are equal in source and target
  • All attributes in force_differents are different in source and target

For instance:

>>> class Person:
...    def __init__(self, **kwargs):
...        for which in kwargs:
...            setattr(self, which, kwargs[which])

>>> source = Person(name='Manuel', age=33, sex='male')
>>> target = {'name': 'Manuel', 'age': 4, 'sex': 'male'}

>>> validate_attrs(source, target, force_equals=('sex',),
...                force_differents=('age',))
True

>>> validate_attrs(source, target, force_equals=('age',))
False

If both force_equals and force_differents are empty it will return True:

>>> validate_attrs(source, target)
True
xotl.tools.objects.iterate_over(source, *keys)[source]

Yields (key, value) pairs for all keys in source.

If any key is missing from source, it is ignored (not yielded).

If source is a collection, iterate over each of the items searching for any of keys. This is not recursive.

If no keys are provided, return an “empty” iterator, i.e. one that will raise StopIteration upon calling next.

New in version 1.5.2.
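For example (a hedged sketch; the outputs are assumed from the description above):

>>> from xotl.tools.objects import iterate_over
>>> somedict = {'a': 1}
>>> class Foo: pass
>>> foo = Foo()
>>> foo.b = 2
>>> sorted(iterate_over((somedict, foo), 'a', 'b', 'missing'))
[('a', 1), ('b', 2)]
>>> list(iterate_over((somedict, foo)))    # no keys: empty iterator
[]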

xotl.tools.objects.smart_getter(obj, strict=False)[source]

Returns a smart getter for obj.

If obj is a mapping, it returns the .get() method bound to the object obj, otherwise it returns a partial of getattr on obj.

Parameters:strict – Set this to True so that the returned getter checks that keys/attrs exists. If strict is True the getter may raise a KeyError or an AttributeError.

Changed in version 1.5.3: Added the parameter strict.
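For example (a hedged sketch; the outputs are assumed from the description above):

>>> from xotl.tools.objects import smart_getter
>>> get = smart_getter({'a': 1})       # mappings: the bound .get() is returned
>>> get('a')
1
>>> get('missing') is None             # non-strict: missing keys yield None
True

>>> class Foo: pass
>>> foo = Foo()
>>> foo.a = 1
>>> smart_getter(foo)('a')             # other objects: a partial of getattr
1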

xotl.tools.objects.smart_getter_and_deleter(obj)[source]

Returns a function that get and deletes either a key or an attribute of obj depending on the type of obj.

If obj is a collections.Mapping it must be a collections.MutableMapping.

xotl.tools.objects.popattr(obj, name, default=None)[source]

Looks for an attribute in the obj and returns its value and removes the attribute. If the attribute is not found, default is returned instead.

Examples:

>>> class Foo:
...   a = 1
>>> foo = Foo()
>>> foo.a = 2
>>> popattr(foo, 'a')
2
>>> popattr(foo, 'a')
1
>>> popattr(foo, 'a') is None
True
xotl.tools.objects.setdefaultattr(obj, name, value)[source]

Sets the attribute name to value if it is not set:

>>> class Someclass: pass
>>> inst = Someclass()
>>> setdefaultattr(inst, 'foo', 'bar')
'bar'

>>> inst.foo
'bar'

>>> inst.spam = 'egg'
>>> setdefaultattr(inst, 'spam', 'with ham')
'egg'

(New in version 1.2.1). If you want the value to be lazily evaluated you may provide a lazy-lambda:

>>> inst = Someclass()
>>> inst.a = 1
>>> def setting_a():
...    print('Evaluating!')
...    return 'a'

>>> setdefaultattr(inst, 'a', lazy(setting_a))
1

>>> setdefaultattr(inst, 'ab', lazy(setting_a))
Evaluating!
'a'
xotl.tools.objects.copy_class(cls, meta=None, ignores=None, new_attrs=None, new_name=None)[source]

Copies a class definition to a new class.

The returned class will have the same name, bases and module of cls.

Parameters:
  • meta – If None, the type(cls) of the class is used to build the new class, otherwise this must be a proper metaclass.
  • ignores

    A sequence of attributes names that should not be copied to the new class.

    An item may be a callable accepting a single argument attr that must return a non-null value if the attr should be ignored.

  • new_attrs (dict) – New attributes the class must have. These will take precedence over the attributes in the original class.
  • new_name – The name for the copy. If not provided, the name of cls is copied.

New in version 1.4.0.

Changed in version 1.7.1: The ignores argument must be an iterable of strings or callables. Removed the glob-pattern and regular expressions as possible values. They are all possible via the callable variant.

New in version 1.7.1: The new_name argument.
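A small hedged example (the class and attribute names are illustrative; the results are assumed from the parameter descriptions above):

>>> from xotl.tools.objects import copy_class
>>> class Base:
...     x = 1
...     y = 2
>>> New = copy_class(Base, ignores=['y'], new_attrs=dict(z=3), new_name='New')
>>> (New.__name__, New.x, New.z)
('New', 1, 3)
>>> hasattr(New, 'y')    # 'y' was ignored when copying (assumed)
False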

xotl.tools.objects.fulldir(obj)[source]

Return a set with all attribute names defined in obj

class xotl.tools.objects.classproperty[source]

A descriptor that behaves like property for instances but for classes.

Example of its use:

class Foobar:
    @classproperty
    def getx(cls):
        return cls._x

A writable classproperty is difficult to define, and this class is not intended for that case because the ‘setter’ and ‘deleter’ decorators can’t be used for obvious reasons. For example:

class Foobar:
    x = 1
    def __init__(self, x=2):
        self.x = x
    def _get_name(cls):
        return str(cls.x)
    def _set_name(cls, x):
        cls.x = int(x)
    name = classproperty(_get_name, _set_name)

New in version 1.4.1.

Changed in version 1.8.0: Inherits from property

xotl.tools.objects.import_object(name, package=None, sep='.', default=None, **kwargs)[source]

Get symbol by qualified name.

The name should be the full dot-separated path to the class:

modulename.ClassName

Example:

celery.concurrency.processes.TaskPool
                            ^- class name

or using ‘:’ to separate module and symbol:

celery.concurrency.processes:TaskPool

Examples:

>>> import_object('celery.concurrency.processes.TaskPool')
<class 'celery.concurrency.processes.TaskPool'>

# Does not try to look up non-string names.
>>> from celery.concurrency.processes import TaskPool
>>> import_object(TaskPool) is TaskPool
True
xotl.tools.objects.get_first_of(sources, *keys, default=None, pred=None)[source]

Return the value of the first occurrence of any of the specified keys in source that matches pred (if given).

Both source and keys have the same meaning as in iterate_over().

Parameters:
  • default – A value to be returned if no key is found in source.
  • pred – A function that should receive a single value and return False if the value is not acceptable, and thus get_first_of should look for another.

Changed in version 1.5.2: Added the pred option.
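For example (a hedged sketch mirroring the pop_first_of() examples below; outputs are assumed):

>>> from xotl.tools.objects import get_first_of
>>> somedict = dict(bar='bar-dict', eggs='eggs-dict')
>>> class Foo: pass
>>> foo = Foo()
>>> foo.bar = 'bar-obj'

>>> get_first_of((foo, somedict), 'eggs')
'eggs-dict'

>>> get_first_of((foo, somedict), 'bar')
'bar-obj'

>>> get_first_of((foo, somedict), 'spam', default='none')
'none'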

xotl.tools.objects.xdir(obj, filter=None, attr_filter=None, value_filter=None, getattr=None)[source]

Return all (attr, value) pairs from obj that make filter(attr, value) True.

Parameters:
  • obj – The object to be introspected.
  • filter

    A filter that will be passed both the attribute name and its value as two positional arguments. It should return True for attrs that should be yielded.

    If None, all pairs will match.

  • getter – A function with the same signature as getattr to be used to get the values from obj. If None, use getattr().

Changed in version 1.8.1: Removed deprecated attr_filter and value_filter arguments.

xotl.tools.objects.fdir(obj, filter=None, attr_filter=None, value_filter=None, getattr=None)[source]

Similar to xdir() but yields only the attribute names.

xotl.tools.objects.smart_copy(*sources, target, *, defaults=False)[source]

Copies the first occurrence of attributes (or keys) from sources to target.

Parameters:
  • sources – The objects from which to extract keys or attributes.
  • target – The object to fill.
  • defaults (Either a bool, a dictionary, an iterable or a callable.) – Default values for the attributes to be copied as explained below. Defaults to False.

All sources and the target are always positional arguments. There should be at least one source. target will always be the last positional argument.

If defaults is a dictionary or an iterable then only the names provided by iterating over defaults will be copied. If defaults is a dictionary, and one of its keys is not found in any of the sources, then the value of the key in the dictionary is copied to target unless:

  • It’s the value Undefined.
  • An exception object
  • A sequence whose first value is a subclass of Exception, in which case adapt_exception is used.

In these cases a KeyError is raised if the key is not found in the sources.

If defaults is an iterable and a key is not found in any of the sources, None is copied to target.

If defaults is a callable then it should receive one positional argument with the current attribute name and several keyword arguments (we pass source), and return either True or False indicating whether the attribute should be copied.

If defaults is False (or None) only the attributes that do not start with a “_” are copied, if it’s True all attributes are copied.

When target is not a mapping only valid Python identifiers will be copied.

Each source is considered a mapping if it’s an instance of collections.Mapping or a MappingProxyType.

The target is considered a mapping if it’s an instance of collections.MutableMapping.

Returns:target.

Changed in version 1.7.0: defaults is now keyword only.
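For example (a hedged sketch; the behaviour shown for the default defaults=False is assumed from the rules above):

>>> from xotl.tools.objects import smart_copy
>>> source = {'a': 1, '_hidden': 2, 'b': 3}
>>> target = {}
>>> smart_copy(source, target) is target
True
>>> sorted(target.items())     # '_hidden' starts with '_', so it's not copied
[('a', 1), ('b', 3)]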

xotl.tools.objects.extract_attrs(obj, *names, default=Unset)[source]

Extracts all names from an object.

If obj is a Mapping, the names will be searched in the keys of obj; otherwise the names are considered regular attribute names.

If default is Unset and any name is not found, an AttributeError is raised, otherwise the default is used instead.

Returns a tuple if there is more than one name, otherwise returns a single value.

New in version 1.4.0.

Changed in version 1.5.3: Each name may be a path like in get_traverser(), but only “.” is allowed as separator.
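For example (a hedged sketch; outputs are assumed from the description above, and the “.”-path form relies on the behaviour described in the note just above):

>>> from xotl.tools.objects import extract_attrs
>>> invoice = {'client': {'name': 'Ana'}, 'total': 10}
>>> extract_attrs(invoice, 'client.name', 'total')
('Ana', 10)
>>> extract_attrs(invoice, 'total')
10
>>> extract_attrs(invoice, 'missing', default=None) is None
True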

xotl.tools.objects.traverse(obj, path, default=Unset, sep='.', getter=None)[source]

Traverses an object’s hierarchy by performing an attribute get at each level.

This helps getting an attribute that is buried down several levels deep. For example:

traverse(request, 'session.somevalue')

If default is not provided (i.e. it is Unset) and any component in the path is not found, an AttributeError exception is raised.

You may provide sep to change the default separator.

You may provide a custom getter. By default, a smart_getter() over the objects is used. If provided, getter should have the signature of getattr().

See get_traverser() if you need to apply the same path(s) to several objects. Actually this is equivalent to:

get_traverser(path, default=default, sep=sep, getter=getter)(obj)
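Continuing the request example above (a hedged sketch using SimpleNamespace; outputs are assumed):

>>> from xotl.tools.objects import traverse
>>> from xotl.tools.future.types import SimpleNamespace as new
>>> request = new(session=new(somevalue=1))
>>> traverse(request, 'session.somevalue')
1
>>> traverse(request, 'session.missing', default=None) is None
True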
xotl.tools.objects.get_traverser(*paths, default=Unset, sep='.', getter=None)[source]

Combines the power of traverse() with the expectations from both operator.itemgetter() and operator.attrgetter().

Parameters:paths – Several paths to extract.

Keyword arguments has the same meaning as in traverse().

Returns:A function that, when invoked with an object, traverses the object finding each path.

New in version 1.5.3.

xotl.tools.objects.dict_merge(*dicts, **other)[source]

Merges several dicts into a single one.

Merging is similar to updating a dict, but if values are non-scalars they are also merged in this way:

  • Any two sequences or sets are joined together.
  • Any two mappings are recursively merged.
  • Other types are just replaced like in update().

If two values of incompatible types are found for a single key, a TypeError is raised. If the values for a single key are compatible but different (e.g. a list and a tuple), the resulting type will be the type of the first appearance of the key, except for mappings, which are always cast to dicts.

No matter the types of dicts the result is always a dict.

Without arguments, return the empty dict.
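For example (a hedged sketch; the results are assumed from the merge rules above):

>>> from xotl.tools.objects import dict_merge
>>> d1 = {'seq': [1], 'map': {'x': 1}, 'scalar': 1}
>>> d2 = {'seq': [2], 'map': {'y': 2}, 'scalar': 2}
>>> result = dict_merge(d1, d2)
>>> result['seq']                      # sequences are joined
[1, 2]
>>> result['map'] == {'x': 1, 'y': 2}  # mappings are merged recursively
True
>>> result['scalar']                   # other values are replaced
2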

xotl.tools.objects.pop_first_of(source, *keys, default=None)[source]

Similar to get_first_of() using as source either an object or a mapping and deleting the first attribute or key.

Examples:

>>> somedict = dict(bar='bar-dict', eggs='eggs-dict')

>>> class Foo: pass
>>> foo = Foo()
>>> foo.bar = 'bar-obj'
>>> foo.eggs = 'eggs-obj'

>>> pop_first_of((somedict, foo), 'eggs')
'eggs-dict'

>>> pop_first_of((somedict, foo), 'eggs')
'eggs-obj'

>>> pop_first_of((somedict, foo), 'eggs') is None
True

>>> pop_first_of((foo, somedict), 'bar')
'bar-obj'

>>> pop_first_of((foo, somedict), 'bar')
'bar-dict'

>>> pop_first_of((foo, somedict), 'bar') is None
True
xotl.tools.objects.fix_method_documentation(cls, method_name, ignore=None, min_length=10, deep=1, default=None)[source]

Fix the documentation for the given class using its super-classes.

This function may be useful for shells or Python Command Line Interfaces (CLI).

If cls has an invalid documentation, its super-classes are searched in MRO order until a documentation definition is found at some level.

Parameters:
  • ignore – could be used to specify which classes to ignore by specifying its name in this list.
  • min_length – documentation strings shorter than this number of characters are also ignored.
xotl.tools.objects.multi_getter(source, *ids)[source]

Get values from source of all given ids.

Parameters:
  • source – Any object; mappings are handled differently from other object types.
  • ids

    Identifiers to get values from source.

    An ID item could be:

    • a string: is considered a key, if source is a mapping, or an attribute name if source is an instance of any other type.
    • a collection of strings: find the first valid value in source evaluating each item in this collection using the above logic.

Example:

>>> d = {'x': 1, 'y': 2, 'z': 3}
>>> list(multi_getter(d, 'a', ('y', 'x'), ('x', 'y'), ('a', 'z', 'x')))
[None, 2, 1, 3]

>>> next(multi_getter(d, ('y', 'x'), ('x', 'y')), '---')
2

>>> next(multi_getter(d, 'a', ('b', 'c'), ('e', 'f')), '---') is None
True

New in version 1.7.1.

xotl.tools.objects.get_branch_subclasses(cls, *, include_this=False)[source]

Similar to type.__subclasses__() but recursive.

Only return the sub-classes at the end of branches (those with no sub-classes of their own). Instead of returning a list, yield each valid value.

New in version 1.7.0.

Changed in version 2.1.5: Add keyword-only argument include_this.

xotl.tools.objects.iter_final_subclasses(cls, *, include_this=True)[source]

Iterate over the final sub-classes of cls.

Final classes are those which have no sub-classes. If cls is final, the iterator yields only cls unless include_this is False.

New in version 2.1.0.

Deprecated since version 2.1.5: This is actually a duplicate of iter_branch_subclasses().

xotl.tools.objects.get_final_subclasses(cls, *, include_this=True)[source]

List final sub-classes of cls.

See iter_final_subclasses().

New in version 2.1.0.

Deprecated since version 2.1.5: This is a duplicate of get_branch_subclasses().

xotl.tools.objects.FinalSubclassEnumeration(superclass, *, dynamic=True)[source]

A final sub-class enumeration.

Return an enumeration-like class (i.e. it has __members__ and an attribute per member) that enumerates the final subclasses of a given superclass (not including superclass).

If dynamic is True, don’t cache the subclasses; i.e if a new subclass is created after the enumeration, the __members__ dictionary will change.

The resulting enumeration class has a method invalidate_cache() which allows non-dynamic classes to update their underlying cache.

New in version 2.1.0.

xotl.tools.objects.save_attributes(obj, *attributes, getter=None, setter=None)[source]

A context manager that restores obj attributes at exit.

We deal with obj’s attributes with smart_getter() and smart_setter(). You can override this by passing the keyword arguments getter and setter. They must take the object and return a callable to get/set its attributes.

Basic example:

>>> from xotl.tools.future.types import SimpleNamespace as new
>>> obj = new(a=1, b=2)
>>> with save_attributes(obj, 'a'):
...    obj.a = 2
...    obj.b = 3
>>> obj.a
1
>>> obj.b
3

Depending on the behavior of getter and/or the object itself, it may be an error to get an attribute or key that does not exist.

>>> getter = lambda o: lambda a: getattr(o, a)
>>> with save_attributes(obj, 'c', getter=getter):   # doctest: +ELLIPSIS
...    pass
Traceback (...)
...
AttributeError: ...

Beware, however, that smart_getter() is non-strict by default and it returns None for a non-existing key or attribute. In this case, we attempt to set that attribute or key at exit:

>>> with save_attributes(obj, 'x'):
...   pass
>>> obj.x is None
True

But, then, setting the value may fail:

>>> obj = object()
>>> with save_attributes(obj, 'x'):  # doctest: +ELLIPSIS
...   pass
Traceback (...)
...
AttributeError: ...

New in version 1.8.2.

xotl.tools.objects.temp_attributes(obj, attrs, getter=None, setter=None)[source]

A context manager that temporarily sets attributes.

attrs is a dictionary containing the attributes to set.

Keyword arguments getter and setter have the same meaning as in save_attributes(). We also use the setter to set the values provided in attrs.

New in version 1.8.5.
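For example (a hedged sketch; the outputs are assumed from the description above):

>>> from xotl.tools.objects import temp_attributes
>>> from xotl.tools.future.types import SimpleNamespace as new
>>> obj = new(a=1)
>>> with temp_attributes(obj, dict(a=2)):
...     print(obj.a)
2
>>> obj.a
1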

class xotl.tools.objects.memoized_property(fget, doc=None)[source]

A read-only property that is only evaluated once.

This is extracted from the SQLAlchemy project’s codebase, merit and copyright goes to SQLAlchemy authors:

Copyright (C) 2005-2011 the SQLAlchemy authors and contributors

This module is part of SQLAlchemy and is released under the MIT License:
http://www.opensource.org/licenses/mit-license.php

New in version 1.8.1: Ported from xoutil.decorator.memoized_property.

reset(instance)[source]

Clear the cached value of instance.

xotl.tools.objects.delegator(attribute, attrs_map, metaclass=<class 'type'>)[source]

Create a base class that delegates attributes to another object.

The returned base class contains a delegated attribute descriptor for each key in attrs_map.

Parameters:
  • attribute – The attribute of the delegating object that holds the delegated attributes.
  • attrs_map – A map of attributes to delegate. The keys are the attribute names of the delegating object, and the values are the attribute names of the delegated object.

Example:

>>> class Bar:
...     x = 'bar'
>>> class Foo(delegator('egg', {'x1': 'x'})):
...     def __init__(self):
...         self.egg = Bar()
>>> foo = Foo()
>>> foo.x1
'bar'

New in version 1.9.3.

class xotl.tools.objects.DelegatedAttribute(target_name, delegated_attr, default=Unset)[source]

A delegator data descriptor.

When accessed the descriptor finds the delegated_attr in the instance’s value given by attribute target_name.

If the instance has no attribute with name target_name, raise an AttributeError.

If the target object does not have an attribute with name delegated_attr and default is Unset, raise an AttributeError. If default is not Unset, return default.

New in version 1.9.3.

xotl.tools.params – Tools for managing function arguments

Tools for managing function arguments.

Processing function arguments can be messy when a flexible schema is needed. With this module you can outline a parameter schema and process the actual arguments in a smart way:

A parameter row (see ParamSchemeRow) allows several keyword IDs (one of them, required, is used as the final identifier for the actual argument), as well as integer IDs expressing the logical order for positional argument passing (negative values index right-to-left, like in sequences). Several values mean several possibilities.

New in version 1.8.0.

Examples

In the next example, the parameter with key “stream” can also be passed under the name “output”; it must be a file, its default value is sys.stdout, and if passed positionally it can be either the first or the last argument.

>>> import sys
>>> from xotl.tools.values import file_coerce as is_file
>>> from xotl.tools.values import positive_int_coerce as positive_int
>>> from xotl.tools.params import ParamScheme as scheme, ParamSchemeRow as row
>>> sample_scheme = scheme(
...     row('stream', 0, -1, 'output', default=sys.stdout, coerce=is_file),
...     row('indent', 0, 1, default=1, coerce=positive_int),
...     row('width', 0, 1, 2, 'max_width', default=79, coerce=positive_int),
...     row('newline', default='\n', coerce=(str, )))

Some tests:

>>> def test(*args, **kwargs):
...     return sample_scheme(args, kwargs)

>>> test(4, 80)
{'indent': 4,
 'newline': '\n',
 'stream': <open file '<stdout>', mode 'w' at 0x7f927b32b150>,
 'width': 80}

>>> test(2, '80')    # Because the positive-int coercer accepts valid string values
{'indent': 2,
 'newline': '\n',
 'stream': <open file '<stdout>', mode 'w' at 0x7f927b32b150>,
 'width': 80}

>>> test(sys.stderr, 4, 80)
{'indent': 4,
 'newline': '\n',
 'stream': <open file '<stderr>', mode 'w' at 0x7f927b32b1e0>,
 'width': 80}

>>> test(4, sys.stderr, newline='\n\r')
{'indent': 4,
 'newline': '\n\r',
 'stream': <open file '<stderr>', mode 'w' at 0x7f927b32b1e0>,
 'width': 79}

>>> sample_scheme((4, 80), {'extra': 'extra param'}, strict=False)
{'extra': 'extra param',
 'indent': 4,
 'newline': '\n',
 'stream': <open file '<stdout>', mode 'w' at 0x7f3c6815c150>,
 'width': 80}

Another way to use this is through a ParamManager instance, created from the actual arguments of a function:

>>> def slugify(value, *args, **kwds):
...     from xotl.tools.params import ParamManager
...     getarg = ParamManager(args, kwds)
...     replacement = getarg('replacement', 0, default='-',
...                          coercers=(str, ))
...     invalid_chars = getarg('invalid_chars', 'invalid', 'invalids', 0,
...                            default='', coercers=_ascii)
...     valid_chars = getarg('valid_chars', 'valid', 'valids', 0,
...                          default='', coercers=_ascii)
...     # And so on.

Notice that each call follows the same protocol as a parameter definition row (see ParamSchemeRow).

Module Members

xotl.tools.params.issue_9137(args)[source]

Parse arguments for methods, fixing issue 9137 (self ambiguity).

There are methods that expect ‘self’ as a valid keyword argument; this is not possible if that name is used explicitly in the signature:

def update(self, *args, **kwds):
    ...

To solve this, declare the arguments as method_name(*args, **kwds), and in the function code:

self, args = issue_9137(args)
Returns:(self, remainder positional arguments in a tuple)

New in version 1.8.0.
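For example (a hedged sketch; the class Record is an illustrative assumption and the outputs follow from the description above):

>>> from xotl.tools.params import issue_9137
>>> class Record:
...     def update(*args, **kwds):
...         self, args = issue_9137(args)
...         return self, args, kwds
>>> r = Record()
>>> obj, rest, kwds = r.update(1, 2, self='payload')
>>> obj is r
True
>>> rest
(1, 2)
>>> kwds
{'self': 'payload'}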

xotl.tools.params.check_count(args, low, high=1048576, caller=None)[source]

Check the actual count of positional arguments against the given constraints.

Parameters:
  • args – The args whose count to check; normally a tuple, but an integer is directly accepted.
  • low – Integer expressing the minimum count allowed.
  • high – Integer expressing the maximum count allowed.
  • caller – Name of the function issuing the check, its value is used only for error reporting.

New in version 1.8.0.

xotl.tools.params.check_default(absent=Undefined)[source]

Get a default value passed as a last excess positional argument.

Parameters:absent – The value to be used by default if no one is given. Defaults to Undefined.

For example:

def get(self, name, *default):
    from xotl.tools.params import check_default, Undefined
    if name in self.inner_data:
        return self.inner_data[name]
    elif check_default()(*default) is not Undefined:
        return default[0]
    else:
        raise KeyError(name)

New in version 1.8.0.

xotl.tools.params.single(args, kwds)[source]

Return a true value only when a unique argument is given.

When needed, the most suitable result will be wrapped using the Maybe.

New in version 1.8.0.

xotl.tools.params.pop_keyword_arg(kwargs, names, default=Undefined)[source]

Return the value of a keyword argument.

Parameters:
  • kwargs – The mapping with passed keyword arguments.
  • names – Could be a single name, or a collection of names.
  • default – The default value to return if no value is found.

New in version 1.8.0.
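For example (a hedged sketch; it assumes the matched value is removed from kwargs, as the name suggests, and the outputs follow from the parameter descriptions):

>>> from xotl.tools.params import pop_keyword_arg
>>> kwargs = {'duration': 10, 'other': 1}
>>> pop_keyword_arg(kwargs, ('timeout', 'duration'), default=None)
10
>>> sorted(kwargs)          # the matched key was popped (assumed)
['other']
>>> pop_keyword_arg(kwargs, 'missing', default=None) is None
True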

xotl.tools.params.pop_keyword_values(kwargs, *names, **options)[source]

Return a list with all keyword argument values.

Parameters:
  • kwargs – The mapping with passed keyword arguments.
  • names – Each item will be a definition of keyword argument name to retrieve. Could be a string with a name, or a list of alternatives (aliases).
  • default – Keyword-only option defining a default value to be used in place of arguments that are not given. If not given, the special value Undefined is used.
  • defaults

    A dictionary with default values per argument name. If none is given, use default.

    Note

    defaults trumps default.

    Warning

    For the case where a single name has several alternatives, you may choose any of the alternatives. If you pass several diverging defaults for different alternatives, the result is undefined.

  • ignore_error – By default, when there are values remaining in kwargs after all names are processed, a TypeError is raised. If this keyword-only option is True, the function returns normally.

Examples:

>>> pop_keyword_values({'b': 1}, 'a', 'b')
[Undefined, 1]

>>> kwargs = {'a': 1, 'b': 2, 'c': 3}
>>> try:
...     res = pop_keyword_values(kwargs, 'a', 'b')
... except TypeError as error:
...     res = error
>>> type(res)
<class 'TypeError'>

>>> kwargs = {'a': 1, 'b': 2, 'c': 3}
>>> options = dict(ignore_error=True, default=None)
>>> pop_keyword_values(kwargs, 'a', ('B', 'b'), **options)
[1, 2]

New in version 1.8.3.

class xotl.tools.params.ParamManager(args, kwds)[source]

Function parameters parser.

For example:

def wraps(*args, **kwargs):
    pm = ParamManager(args, kwargs)
    name = pm(0, 1, 'name', coerce=str)
    wrapped = pm(0, 1, 'wrapped', coerce=valid(callable))
    ...

When an instance of this class is called (the __call__ operator), the same protocol is used as when creating an instance of a parameter definition row (ParamSchemeRow).

See ParamScheme class as another way to define and validate schemes for extracting parameter values in a consistent way.

New in version 1.8.0.

__call__(*ids, **options)[source]

Get a parameter value.

__init__(args, kwds)[source]

Created with actual parameters of a client function.

remainder()[source]

Return not consumed values in a mapping.

class xotl.tools.params.ParamScheme(*rows)[source]

Full scheme for a ParamManager instance call.

This class receives a set of ParamSchemeRow instances and validate them as a whole.

New in version 1.8.0.

__call__(args, kwds, strict=True)[source]

Get a mapping with all resulting values.

If the special value ‘none’ is used as the ‘default’ option in a scheme-row, the corresponding value isn’t returned in the mapping when the parameter value is missing.

__getitem__(idx)[source]

Obtain the scheme-row by a given index.

__iter__()[source]

Iterate over all defined scheme-rows.

__len__()[source]

The number of defined scheme-rows.

defaults

Return a mapping with all valid default values.

class xotl.tools.params.ParamSchemeRow(*ids, **options)[source]

Scheme row for a ParamManager instance call.

This class validates identifiers and options at this level; these checks are not done in a call to get a parameter value.

Normally this class is used as part of a full ParamScheme composition.

In addition to the options that can be passed to ParamManager.__call__(), this class can be instantiated with:

Parameters:
  • ids – a variable number of positional arguments; each one could be an alias for keyword parameter passing, or an integer expressing logical order for positional passing (negative values mean right-to-left indexing, like in sequences);
  • key – an identifier to be used when the parameter is only positional or when none of the possible keyword aliases must be used as the primary-key;
  • default – keyword argument, value used if the parameter is absent;
  • coerce – check if a value is valid or not and convert to its definitive value; see xotl.tools.values module for more information.

New in version 1.8.0.

__call__(*args, **kwds)[source]

Execute a scheme-row using as argument a ParamManager instance.

The concept of the ParamManager instance argument is a little tricky: a variable number of arguments is accepted; if there is only one positional argument and it is already an instance of ParamManager, it is used directly; if there are two, a tuple and a dict, they are considered the constructor arguments for a new instance; otherwise all arguments are used to build a new instance.

default

Returned value if parameter value is absent.

If not defined, special value none is returned.

key

The primary key for this scheme-row definition.

This concept is a little tricky (the first string identifier if one is given, otherwise the first integer). This definition is useful, for example, to return the remaining (not consumed) values after a scheme process is completed (see ParamManager.remainder() for more information).

xotl.tools.progress - Console progress utils

Tool to show a progress percent in the terminal.

Deprecated since version 2.1.0.

class xotl.tools.progress.Progress(max_value=100, delta=1, first_message=None, display_width=None)[source]

Print a progress percent to the console. Also the elapsed and the estimated times.

To signal an increment in progress just call the instance and (optionally) pass a message like in:

progress = Progress(10)
for i in range(10):
    progress()

xotl.tools.records - Records definitions

Included reader builders

The following functions build readers for standards types.

Note

You cannot use these functions themselves as readers, but you must call them to obtain the desired reader.

All these functions have a pair of keyword arguments, nullable and default. The argument nullable indicates whether the value must be present or not. The function check_nullable() implements this check and allows others to create their own builders with the same semantics.

Checking for null values

These functions allow you to define new builders that use the same null concept. For instance, if you need readers that parse dates in different locales you may do:

def date_reader(nullable=False, default=None, locale=None):
    from xotl.tools.records import check_nullable
    from babel.dates import parse_date, LC_TIME
    from datetime import datetime
    if not locale:
        locale = LC_TIME

    def reader(value):
        if check_nullable(value, nullable):
            return parse_date(value, locale=locale)
        else:
            return default
    return reader

xotl.tools.string - Common string operations

normalize_slug(value, replacement='-', invalid_chars='', valid_chars='', encoding=None)

Deprecated alias of slugify().

xotl.tools.symbols – Logical values

Special logical values like Unset, Undefined, Ignored, Invalid, …

All these values can only be True or False, but they are intended for places where None is expected to be a valid value, or for special Boolean formats.

xotl.tools.symbols.Ignored = Ignored

To be used in arguments that are currently ignored because they are being deprecated. The only valid reason to use Ignored is to signal ignored arguments in a method’s or function’s signature.

xotl.tools.symbols.Invalid = Invalid

To be used in functions resulting in a fail where False could be a valid value.

class xotl.tools.symbols.MetaSymbol[source]

Meta-class for symbol types.

nameof(s)[source]

Get the name of a symbol instance (s).

parse(name)[source]

Returns instance from a string.

Standard Python Boolean values are parsed too.

xotl.tools.symbols.This = This

To be used as a mark for the current context, as a convenience.

xotl.tools.symbols.Undefined = Undefined

A False value for local scope use, or where Unset could be a valid value.

xotl.tools.symbols.Unset = Unset

False value, mainly for function parameter definitions, where None could be a valid value.

class xotl.tools.symbols.boolean(*args, **kwds)[source]

Instances are custom logical values (True or False).

Special symbols allowing only logical (False or True) values.

For example:

>>> true = boolean('true', True)
>>> false = boolean('false')
>>> none = boolean('false')
>>> unset = boolean('unset')

>>> class X:
...      attr = None

>>> getattr(X(), 'attr') is not None
False

>>> getattr(X(), 'attr', false) is not false
True

>>> none is false
True

>>> false == False
True

>>> false == unset
True

>>> false is unset
False

>>> true == True
True
class xotl.tools.symbols.symbol(*args, **kwds)[source]

Instances are custom symbols.

Symbol instances identify uniquely a semantic concept by its name. Each one has an ordinal value associated.

For example:

>>> ONE2MANY = symbol('ONE2MANY')
>>> ONE_TO_MANY = symbol('ONE2MANY')

>>> ONE_TO_MANY is ONE2MANY
True

xotl.tools.tasking – Task oriented tools.

class StandardWait

A deprecated alias for ConstantAlias.

xotl.tools.testing – Testing utilities

Provides sample data generators for xotl.tools’s data structures.

Warning

You must install xotl.tools[testing] in order to get extra dependencies.

New in version 1.8.2.

xotl.tools.testing.datetime – Generators for date and datetime

xotl.tools.validators – value validators

Some generic value validators and regular expressions and validation functions for several identifiers.

xotl.tools.validators.check(value, validator, msg=None)[source]

Check a value with a validator.

Argument validator could be a callable, a type, or a tuple of types.

Return True if the value is valid.

Examples:

>>> check(1, int)
True

>>> check(10, lambda x: x <= 100, 'must be less than or equal to 100')
True

>>> check(11/2, (int, float))
True
xotl.tools.validators.check_no_extra_kwargs(kwargs)[source]

Check that there are no extra keyword arguments left unprocessed.

For example:

>>> from xotl.tools.validators import check_no_extra_kwargs
>>> def only_safe_arg(**kwargs):
...     safe = kwargs.pop('safe', False)
...     check_no_extra_kwargs(kwargs)
...     print('OK for safe:', safe)
xotl.tools.validators.is_type(cls)[source]

Return a validator with the same name as the type given as argument value.

Parameters:cls – Class or type or tuple of several types.
xotl.tools.validators.ok(value, *checkers, **kwargs)[source]

Validate a value with several checkers.

Return the value if it is OK, or raise a ValueError exception if not.

Arguments:

Parameters:
  • value – the value to validate
  • checkers – a variable number of checkers (at least one); each one could be a type, a tuple of types, or a callable that receives the value and returns whether it is valid or not. For the value to be considered valid, all checkers must validate it.
  • message – keyword argument to be used in case of error; will be the argument of ValueError exception; could contain the placeholders {value} and {type}; a default value is used if this argument is not given.
  • msg – an alias for “message”
  • extra_checkers – Intended for creating validators using partial(); must be a tuple.

Keyword arguments are not validated to be correct.

This function could be used with type-definitions for arguments, see xotl.tools.fp.prove.semantic.TypeCheck.

Examples:

>>> ok(1, int)
1

>>> ok(10, int, lambda x: x < 100, message='Must be integer under 100')
10

>>> ok(11/2, (int, float))
5.5

>>> ok(11/2, int, float)
5.5

>>> try:
...     res = ok(11/2, int)
... except ValueError:
...     res = '---'
>>> res
'---'
xotl.tools.validators.predicate(*checkers, **kwargs)[source]

Return a validation checker for types and simple conditions.

Parameters:
  • checkers

    A variable number of checkers; each one could be:

    • A type, or tuple of types, to test valid values with isinstance(value, checker)
    • A set or mapping of valid values, the value is valid if contained in the checker.
    • A tuple of other inner checkers, if any of the checkers validates a value, the value is valid (OR).
    • A list of other inner checkers, all checkers must validate the value (AND).
    • A callable that receives the value and returns True if the value is valid.
    • True and False could be used as checkers always validating or invalidating the value.

    An empty list or no checker is a synonym of True; an empty tuple, set, or mapping is a synonym of False.

  • name – Keyword argument to be used in case of error; will be the argument of ValueError exception; could contain the placeholders {value} and {type}; a default value is used if this argument is not given.
  • force_name – Keyword argument to force a name if not given.

In order to obtain good documentation, use proper names for functions and lambda arguments.

Real type checkers can be built with this function, for example:

>>> is_valid_age = predicate((int, float), lambda age: 0 < age <= 120)
>>> is_valid_age(100)
True

>>> is_valid_age(130)
False

>>> always_true = predicate(True)
>>> always_true(False)
True

>>> always_false = predicate(False)
>>> always_false(True)
False

>>> always_true = predicate()
>>> always_true(1)
True

>>> always_true('any string')
True

>>> always_false = predicate(())
>>> always_false(1)
False

>>> always_false('any string')
False

Contents:

xotl.tools.validators.identifiers – Simple identifiers validators

Regular expressions and validation functions for several identifiers.

xotl.tools.validators.identifiers.is_valid_identifier(name)[source]

Returns True if name is a valid Python identifier.

If name is not a string, return False. This is roughly:

isinstance(name, str) and name.isidentifier()
xotl.tools.validators.identifiers.is_valid_full_identifier(name)[source]

Returns True if name is a valid dotted Python identifier.

See is_valid_identifier() for what “validity” means.

xotl.tools.validators.identifiers.is_valid_public_identifier(name)[source]

Returns True if name is a valid Python identifier that is deemed public.

Convention says that any name starting with a “_” is not public.

See is_valid_identifier() for what “validity” means.

xotl.tools.values – coercers (or checkers) for value types

Some generic coercers (or checkers) for value types.

The coercion functions in this module are not related in any way to the deprecated old Python coerce feature; they are similar to a combination of object mold/check:

  • Mold - Fit values to expected conventions.
  • Check - These functions must return the special value nil [1] to specify that the expected fit is not possible.
[1]We don’t use Python’s classic NotImplemented special value so that a failed coercion yields a value (nil) that evaluates to False.

A custom coercer could be created with closures, for an example see create_int_range_coerce().

This module uses Unset value to define absent -not being specified- arguments.

Also contains sub-modules to obtain, convert and check values of common types.

New in version 1.7.0.

class xotl.tools.values.MetaCoercer[source]

Meta-class for coercer.

This meta-class allows that several objects are considered valid instances of coercer:

  • Functions decorated with coercer (used with its decorator facet).
  • Instances of any sub-class of custom.
  • Instances of coercer itself.

See the class declaration (coercer) for more information.

xotl.tools.values.callable_coerce(arg)[source]

Check if arg is a callable object.

class xotl.tools.values.coercer[source]

Special coercer class.

This class has several facets:

  • Pure type-checkers when a type or tuple of types are received as argument. See istype for more information.

  • Return equivalent coercer from some special values:

    • Any true value -> identity_coerce
    • Any false or empty value -> void_coerce
  • A decorator for functions; when a function is given, decorate it to become a coercer. The mark itself is not enough: functions intended to be coercers must fulfill the protocol (never raise exceptions and return nil on failure). For example:

    >>> @coercer
    ... def age_coerce(arg):
    ...     res = int_coerce(arg)
    ...     return res if t(res) and 0 < res <= 120 else nil
    
    # TODO: Change next, don't use isinstance
    >>> isinstance(age_coerce, coercer)
    True
    
xotl.tools.values.coercer_name(arg, join=None)[source]

Get the name of a coercer.

Parameters:
  • arg – Coercer to get the name. Also processes collections (tuple, list, or set) of coercers. Any other value is considered invalid and raises an exception.
  • join

    When a collection is used: if this argument is None, a collection of names is returned; if not None, it is used to join the items into a resulting string.

    For example:

    >>> coercer_name((int_coerce, float_coerce))
    ('int', 'float')
    
    >>> coercer_name((int_coerce, float_coerce), join='-')
    'int-float'
    

    To obtain pretty-print tuples, use something like:

    >>> coercer_name((int_coerce, float_coerce),
    ...              join=lambda arg: '(%s)' % ', '.join(arg))
    

This function works not only with coercers; any object that fulfills the needed protocol to get its name is also valid.

class xotl.tools.values.combo(*coercers)[source]

Represent a zip composition of several inner coercers.

An instance of this class is constructed from a sequence of coercers and its purpose is to coerce a sequence of values. It returns a sequence [2] whose i-th item is the result of applying the i-th coercer to the i-th value of the argument sequence:

coercers -> (coercer-1, coercer-2, ... )
values -> (value-1, value-2, ... )
combo(coercers)(values) -> (coercer-1(value-1), coercer-2(value-2), ...)

If any value fails to coerce, the function returns nil and the combo’s instance variable scope receives the pair (failed-value, failed-coercer).

The returned sequence is truncated in length to the length of the shortest sequence (coercers or arguments).

If no coercer is given, all sequences are coerced as empty.

[2]The returned sequence is of the same type as the argument sequence if possible.
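For example (a hedged sketch; the output is assumed from the zip semantics above):

>>> from xotl.tools.values import combo, int_coerce, float_coerce
>>> c = combo(int_coerce, float_coerce)
>>> c(('1', '2', 'extra'))    # truncated to the shortest length
(1, 2.0)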
class xotl.tools.values.compose(*args, **kwargs)[source]

Returns the composition of several inner coercers.

compose(f1, ... fn) is equivalent to f1(…(fn(arg))…).

If no coercer is given return identity_coerce().

Could be considered an “AND” operator, with some slight differences due to the nature of coercers: ordering the coercers is important when some can modify (adapt) the original values.

A default coercer could be given as a keyword argument to be used when no value results from the coercers; identity_coerce is assumed if missing.

xotl.tools.values.create_int_range_coerce(min, max)[source]

Create a coercer to check integers between a range.
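For example (a hedged sketch; it assumes the created coercer accepts the same inputs as int_coerce, and the outputs follow from that assumption):

>>> from xotl.tools.values import create_int_range_coerce
>>> age = create_int_range_coerce(0, 120)
>>> age('30')      # valid strings are coerced like int_coerce does (assumed)
30
>>> age(200)       # out of range
nil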

xotl.tools.values.create_unique_member_coerce(coerce, container)[source]

Useful to wrap member coercers when coercing containers.

See iterable and mapping.

The resulting coercer checks that a member is unique (not repeated) after it is coerced.

For example:

>>> from xotl.tools.values import (mapping, create_unique_member_coerce,
...                            int_coerce, float_coerce)

>>> sample = {'1': 1, 2.0: '3', 1.0 + 0j: '4.1'}

>>> dc = mapping(int_coerce, float_coerce)
>>> dc(dict(sample))
{1: 1.0, 2: 3.0}

>>> dc = mapping(create_unique_member_coerce(int_coerce), float_coerce)
>>> dc(dict(sample))
nil
class xotl.tools.values.custom(*args, **kwargs)[source]

Base class for any custom coercer.

The field inner stores internal data used by the custom coercer; it could be a callable, an inner coercer, or a tuple of inner checkers if more than one is needed, …

The field scope stores the exit (non-regular) condition: the value that fails or, if needed, a tuple with (exit-value, exit-coercer) or (error-value, error). The exit condition is not always a failure; for example, in some it is the one that is valid among the other inner coercers. To understand this better, think of the (AND, OR) operators: a chain of ANDs exits with the first failure and a chain of ORs exits with the first success.

All custom coercers are callable (must redefine __call__()) receiving one argument that must be coerced. For example:

>>> def foobar(*args):
...     coerce = pargs(int_coerce)
...     return coerce(args)

This class has two protected fields (_str_join and _repr_join) that are used to call coercer_name() in __str__() and __repr__() special methods.

classmethod flatten(obj, avoid=Unset)[source]

Flatten a coercer set.

Parameters:obj – Could be a coercer representing other inner coercers, or a tuple or list containing coercers.
xotl.tools.values.file_coerce(arg)[source]

Check if arg is a file-like object.

xotl.tools.values.float_coerce(arg)[source]

Check if arg is a valid float.

Other types are checked (string, int, complex).

xotl.tools.values.full_identifier_coerce(arg)[source]

Check if arg is a valid dotted Python identifier.

See identifier_coerce() for what “validity” means.

xotl.tools.values.identifier_coerce(arg)[source]

Check if arg is a valid Python identifier.

Note

Only Python 2’s version of valid identifier. This means that some Python 3 valid identifiers are not considered valid. This helps to keep things working the same in Python 2 and 3.

xotl.tools.values.identity_coerce(arg)[source]

Leaves unchanged the passed argument arg.

xotl.tools.values.int_coerce(arg)[source]

Check if arg is a valid integer.

Other types are checked (string, float, complex).

class xotl.tools.values.istype(*args, **kwargs)[source]

Pure type-checker.

It’s constructed from an argument valid for types_tuple_coerce() coercer.

For example:

>>> int_coerce = istype(int)

>>> int_coerce(1)
1

>>> int_coerce('1')
nil

>>> number_coerce = istype((int, float, complex))

>>> number_coerce(1.25)
1.25

>>> number_coerce('1.25')
nil
class xotl.tools.values.iterable(member_coerce, outer_coerce=True)[source]

Create an inner coercer that coerces an iterable member by member.

See constructor for more information.

Return a list, or the same type of source iterable argument if possible.

For example:

>>> from xotl.tools.values import (iterable, int_coerce,
...                            create_unique_member_coerce)

>>> sample = {'1', 1, '1.0'}

>>> sc = iterable(int_coerce)
>>> sc(set(sample)) == {1}
True

See mapping for more details of this problem. The equivalent safe example is:

>>> member_coerce = create_unique_member_coerce(int_coerce, sample)
>>> sc = iterable(member_coerce)
>>> sc(set(sample))
nil

When executed, it coerces arg (an iterable) member by member using member_coerce. If any member coercion fails, the full execution also fails.

There are three types of results when an instance is executed: (1) iterables that are coerced without modifications, (2) the modified ones but conserving its type, and (3) those that are returned in a list.

class xotl.tools.values.logical(*args, **kwds)[source]

Represent Common Lisp two special values t and nil.

Include redefinition of __call__() to check values with special semantic:

  • When called as t(arg), check if arg is not nil returning a logical true: the same argument if arg is nil or a true boolean value, else return t. That means that False or 0 are valid true values for Common Lisp but not for Python.
  • When called as nil(arg), check if arg is nil returning t or nil if not.

Constructor could receive a valid name (‘nil’ or ‘t’) or any other boolean instance.

class xotl.tools.values.mapping(*args, **kwargs)[source]

Create a coercer to check dictionaries.

Receives two coercers, one for keys and one for values.

For example:

>>> from xotl.tools.values import (mapping, int_coerce, float_coerce,
...                                create_unique_member_coerce)

>>> sample = {'1': 1, 2.0: '3', 1.0 + 0j: '4.1'}

>>> dc = mapping(int_coerce, float_coerce)
>>> dc(dict(sample)) == {1: 1.0, 2: 3.0}
True

When coercing containers it’s probable that members become repeated after coercion. This may not be desirable (mainly in sets and dictionaries). In those cases use create_unique_member_coerce() to wrap the member coercer. For example:

>>> key_coerce = create_unique_member_coerce(int_coerce, sample)
>>> dc = mapping(key_coerce, float_coerce)
>>> dc(dict(sample))
nil

The problem above arises because the coerced versions of '1' and 1.0+0j are the same integer (same hash).

This issue of objects of different types having the same hash can be seen in the example below:

>>> {1: int, 1.0: float, 1+0j: complex} == {1: complex}
True
xotl.tools.values.names_coerce(arg)[source]

Check arg as a tuple of valid object names (identifiers).

If only one string is given, it is returned as the only member of the resulting tuple.

xotl.tools.values.number_coerce(arg)[source]

Check if arg is a valid number (integer or float).

Types that are checked (string, int, float, complex).

class xotl.tools.values.pargs(arg_coerce)[source]

Create an inner coercer that checks variable argument passing.

The created coercer closure must always receive an argument that is a valid iterable with all members properly coerced by the argument of this outer creator function.

If the inner closure argument has only one member, and that member is not properly coerced but is itself an iterable whose members all coerce well, then that member is assumed to be the intended iterable instead of the original argument.

In the following example:

>>> from xotl.tools.values import (iterable, int_coerce)

>>> def foobar(*args):
...     coerce = iterable(int_coerce)
...     return coerce(args)

>>> args = (1, 2.0, '3.0')
>>> foobar(*args)
(1, 2, 3)

>>> foobar(args)
nil

An example using pargs

>>> from xotl.tools.values import (pargs, int_coerce)

>>> def foobar(*args):
...     # Below, "coercer" receives the returned "inner"
...     coerce = pargs(int_coerce)
...     return coerce(args)

>>> args = (1, 2.0, '3.0')
>>> foobar(*args)
(1, 2, 3)

>>> foobar(args)
(1, 2, 3)

The second form is an example of the real utility of this coercer closure: if by mistake a sequence is passed as-is to a function that expects a variable number of arguments, this coercer fixes it.

Instance variable scope stores the last processed invalid argument.

When executed, usually arg is a tuple received by a function as *args form.

When executed, returns a tuple, or the same type of source iterable argument if possible.

See xotl.tools.params for a more specialized and full function arguments conformer.

See combo for a combined coercer that validate each member with a separate member coercer.

xotl.tools.values.positive_int_coerce(arg)[source]

Check if arg is a valid positive integer.

class xotl.tools.values.safe(func)[source]

Uses a function (or callable) in a safe way.

Receives a coercer that expects only one argument and returns another value.

If the returned value is a boolean (maybe the coercer is a predicate), it’s converted to a logical instance.

The wrapped coercer is called in a safe way (inside try/except); if an exception is raised the coercer returns nil and the error is saved in the instance attribute scope.

xotl.tools.values.sized_coerce(arg)[source]

Return a valid sized iterable from arg.

If arg is iterable but not sized, is converted to a list. For example:

>>> sized_coerce(i for i in range(1, 10, 2))
[1, 3, 5, 7, 9]

>>> s = {1, 2, 3}
>>> sized_coerce(s) is s
True
class xotl.tools.values.some(*args, **kwargs)[source]

Represent OR composition of several inner coercers.

some(f1, ..., fn)(arg) is equivalent to f1(arg) or f2(arg) or ... or fn(arg), in the sense of “the first result that is not nil”.

If no coercer is given, void_coerce() is returned.
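
For example (an illustrative sketch based on the description above; outputs are not verified):

>>> from xotl.tools.values import (some, int_coerce, float_coerce)
>>> number = some(int_coerce, float_coerce)
>>> number('3')    # doctest: +SKIP
3

>>> number('x') is nil    # doctest: +SKIP
True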

xotl.tools.values.type_coerce(arg)[source]

Check if arg is a valid type.

class xotl.tools.values.typecast(*args, **kwargs)[source]

A type-caster.

It’s constructed from an argument valid for the types_tuple_coerce() coercer. Similar to istype, but tries to convert the value if needed.

For example:

>>> int_cast = typecast(int)

>>> int_cast('1')
1

>>> int_cast('1x')
nil
xotl.tools.values.types_tuple_coerce(arg)[source]

Check if arg is valid for isinstance or issubclass 2nd argument.

Type checkers are any class, a type or tuple of types. For example:

>>> types_tuple_coerce(object) == (object,)
True

>>> types_tuple_coerce((int, float)) == (int, float)
True

>>> types_tuple_coerce('not-a-type') is nil
True

See type_coerce for more information.

xotl.tools.values.void_coerce(arg)[source]

Always nil.

Contents:

xotl.tools.values.ids – unique identifiers at different contexts

Utilities to obtain identifiers that are unique at different contexts.

Contexts could be global, host local or application local. All standard uuid tools are included in this one: UUID, uuid1(), uuid3(), uuid4(), uuid5(), getnode() and standard UUIDs constants NAMESPACE_DNS, NAMESPACE_URL, NAMESPACE_OID and NAMESPACE_X500.

This module also contains:

  • str_uuid(): Return a string with a GUID representation, random if the argument is True, or a host ID if not.

New in version 1.7.0.

Deprecated since version 2.1.0.

xotl.tools.values.ids.str_uuid(random=False)[source]

Return a “Global Unique ID” as a string.

Parameters:random – If True, a random uuid is generated (does not use host id).

Deprecated since version 2.1.0: Use uuid.uuid4() or uuid.uuid1().
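
Following the deprecation note, the standard library can be used directly; a minimal example:

>>> import uuid
>>> isinstance(str(uuid.uuid4()), str)
True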

xotl.tools.values.simple – Simple or internal coercers

Simple or internal coercers.

With coercers defined in this module, many of the xotl.tools.string utilities could be deprecated.

In Python 3, all arrays (not only those containing valid byte or unicode characters) support the buffer protocol.

xotl.tools.values.simple.ascii_coerce(arg)[source]

Coerce to string containing only ASCII characters.

Convert all non-ascii to valid characters using unicode ‘NFKC’ normalization.
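
An illustrative example (the result is a sketch of the intent, not verified output):

>>> ascii_coerce('café')    # doctest: +SKIP
'cafe'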

xotl.tools.values.simple.ascii_set_coerce(arg)[source]

Coerce to string with only ASCII characters removing repetitions.

Convert all non-ascii to valid characters using unicode ‘NFKC’ normalization.

xotl.tools.values.simple.bytes_coerce(arg)[source]

Encode a unicode string (or any object), returning a bytes buffer.

Uses the defined encoding system value.

In Python 2.x, bytes coincides with the str type; in Python 3, str is unicode and is different from bytes.

There are differences if you want to obtain a buffer in Python 2.x and Python 3; for example, the following code obtains different results:

>>> ba = bytes([65, 66, 67])

In Python 2.x the string "[65, 66, 67]" is obtained, while in Python 3 the result is b"ABC". This function normalizes these differences.

Name is used in named objects, see name_coerce() for more information.

See str_coerce() to coerce to standard string type, bytes in Python 2.x and unicode (str) in Python 3.

Always returns the bytes type.

New in version 1.7.0.
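
For example (illustrative only; assumes the configured system encoding can represent the input):

>>> bytes_coerce('abc') == b'abc'    # doctest: +SKIP
True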

xotl.tools.values.simple.chars_coerce(arg)[source]

Convert to unicode characters.

If arg is an integer between 0 and 0x10ffff, it is converted as a Unicode ordinal (code point); otherwise it is converted with unicode_coerce().
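
For example (an illustrative sketch; output not verified):

>>> chars_coerce(65)    # doctest: +SKIP
'A'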

xotl.tools.values.simple.collection(arg=nil, avoid=(), force=False, base=None, name=None)[source]

Coercer for logic collections.

Inner coercer returns the same argument if it is a strict iterable. In Python, strings are normally iterables, but never in our logic. So:

>>> collection('abc') is nil
True

This function can directly check an argument if it isn’t nil, or return a coercer built from the extra parameters:

Parameters:
  • avoid

    a type or tuple of extra types to ignore as valid collections; for example:

    >>> collection(avoid=dict)({}) is nil
    True
    >>> collection()({}) is nil
    False
    
  • force

if the main argument is not a valid collection, it is wrapped in a list:

    >>> collection(avoid=(dict,), force=True)({}) == [{}]
    True
    
  • base – if not None, must be the base to check instead of Iterable.
  • name – decorate inner coercer with that function name.
xotl.tools.values.simple.decode_coerce(arg)[source]

Decode objects implementing the buffer protocol.

xotl.tools.values.simple.encode_coerce(arg)[source]

Encode string objects.

xotl.tools.values.simple.force_collection_coerce(arg)

Return the same argument if it is a strict iterable. Strings and mappings (collections.abc.Mapping) are not considered valid iterables in this case. A non-iterable argument is wrapped in a list.

xotl.tools.values.simple.force_iterable_coerce(arg)

Return the same argument if it is a strict iterable. Strings are not considered valid iterables in this case. A non iterable argument is wrapped in a list.
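
An illustration of the documented behavior (outputs not verified):

>>> force_iterable_coerce('abc')    # doctest: +SKIP
['abc']

>>> force_iterable_coerce([1, 2]) == [1, 2]    # doctest: +SKIP
True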

xotl.tools.values.simple.force_sequence_coerce(arg)

Return the same argument if it is a strict iterable. Strings and mappings (collections.abc.Mapping) are not considered valid iterables in this case. A non-iterable argument is wrapped in a list.

xotl.tools.values.simple.isnot(value)[source]

Create a coercer that returns arg if arg is not value.

xotl.tools.values.simple.iterable_coerce(arg)[source]

Return the same argument if it is an iterable.

xotl.tools.values.simple.logic_collection_coerce(arg)

Return the same argument if it is a strict iterable. Strings and mappings (collections.abc.Mapping) are not considered valid iterables in this case.

xotl.tools.values.simple.logic_iterable_coerce(arg)

Return the same argument if it is a strict iterable. Strings are not considered valid iterables in this case.

xotl.tools.values.simple.logic_sequence_coerce(arg)

Return the same argument if it is a strict iterable. Strings and mappings (collections.abc.Mapping) are not considered valid iterables in this case.

xotl.tools.values.simple.lower_ascii_coerce(arg)[source]

Coerce to string containing only lower-case ASCII characters.

Convert all non-ascii to valid characters using unicode ‘NFKC’ normalization.

xotl.tools.values.simple.lower_ascii_set_coerce(arg)[source]

Coerce to string with only lower-case ASCII chars removing repetitions.

Convert all non-ascii to valid characters using unicode ‘NFKC’ normalization.

xotl.tools.values.simple.name_coerce(arg)[source]

If arg is a named object, return its name, else nil.

Object names are always of str type, other types are considered invalid.

Generator objects have the special __name__ attribute, but they are ignored and considered invalid.
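
For example (an illustrative sketch; outputs not verified):

>>> def foo(): pass
>>> name_coerce(foo)    # doctest: +SKIP
'foo'

>>> name_coerce(1) is nil    # doctest: +SKIP
True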

xotl.tools.values.simple.not_false(default)[source]

Create a coercer that returns default if arg is considered false.

See not_false_coercer() for more information on values considered false.

xotl.tools.values.simple.not_false_coercer(arg)[source]

Validate that arg is not a false value.

The Python convention for values considered True or False is not used here; our false values are only None or any false instance of xotl.tools.symbols.boolean (including, of course, False itself).
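
An illustrative sketch (outputs not verified):

>>> not_false_coercer(None) is nil    # doctest: +SKIP
True

>>> not_false_coercer(0)    # doctest: +SKIP  (0 is not one of our false values)
0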

xotl.tools.values.simple.str_coerce(arg)[source]

Coerce to standard string type.

bytes in Python 2.x and unicode (str) in Python 3.

New in version 1.7.0.

Deprecated since version 2.0.6.

xotl.tools.values.simple.strict_string_coerce(arg)[source]

Coerce to string only if argument is a valid string type.

class xotl.tools.values.simple.text[source]

Return a nice text representation of one object.

text(obj='') -> text

text(bytes_or_buffer[, encoding[, errors]]) -> text

Create a new string object from the given object. If encoding or errors is specified, then the object must expose a data buffer that will be decoded using the given encoding and error handler. Otherwise, returns the result of object text representation.

Parameters:
  • encoding – defaults to sys.getdefaultencoding().
  • errors – defaults to ‘strict’.

The join method is improved in order to receive any collection of objects, either as a variable number of arguments or as one iterable.

chr_join(variable_number_args or iterable) → text[source]

Return a text which is the concatenation of the objects (converted to text) in the argument items. The separator between elements is this text instance.

The difference with join() is that integers between 0 and 0x10ffff are converted to characters as Unicode ordinals.

join(variable_number_args or iterable) → text[source]

Return a text which is the concatenation of the objects (converted to text) in the argument items. The separator between elements is this text instance.

See chr_join() for another version of this functionality.
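
A sketch of the intended call forms (the separator semantics follow str.join; outputs are not verified):

>>> text('-').join('a', 'b', 'c')    # doctest: +SKIP
'a-b-c'

>>> text('-').join(['a', 'b', 'c'])    # doctest: +SKIP
'a-b-c'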

xotl.tools.values.simple.unicode_coerce(arg)[source]

Decode a buffer or any object returning unicode text.

Uses the defined encoding system value.

In Python 2.x, unicode is a special type different from str, but in Python 3 it coincides with str.

Name is used in named objects, see name_coerce() for more information.

See str_coerce() to coerce to standard string type, bytes in Python 2.x and unicode (str) in Python 3.

New in version 1.7.0.
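
For example (illustrative only; assumes a UTF-8-compatible system encoding):

>>> unicode_coerce(b'abc')    # doctest: +SKIP
'abc'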

xotl.tools.web – Utils for Web applications

Utils for Web applications.

xotl.tools.web.slugify(s, entities=True, decimal=True, hexadecimal=True)[source]

Convert a string to a slug representation.

Normalizes string, converts to lower-case, removes non-alpha characters, and converts spaces to hyphens.

Parts from http://www.djangosnippets.org/snippets/369/

>>> slugify("Manuel Vázquez Acosta")    # doctest: +SKIP
'manuel-vazquez-acosta'

If entities is True (the default), all HTML entities are replaced by their equivalent characters before normalization:

>>> slugify("Manuel V&aacute;zquez Acosta")   
'manuel-vazquez-acosta'

If entities is False, then no HTML-entities substitution is made:

>>> value = "Manuel V&aacute;zquez Acosta"
>>> slugify(value, entities=False)  
'manuel-v-aacute-zquez-acosta'

If decimal is True, then all entities of the form &#nnnn where nnnn is a decimal number deemed as a unicode codepoint, are replaced by the corresponding unicode character:

>>> slugify('Manuel V&#225;zquez Acosta')  
'manuel-vazquez-acosta'

>>> value = 'Manuel V&#225;zquez Acosta'
>>> slugify(value, decimal=False)  
'manuel-v-225-zquez-acosta'

If hexadecimal is True, then all entities of the form &#xnnnn, where nnnn is a hexadecimal number deemed a unicode codepoint, are replaced by the corresponding unicode character:

>>> slugify('Manuel V&#x00e1;zquez Acosta')  
'manuel-vazquez-acosta'

>>> slugify('Manuel V&#x00e1;zquez Acosta', hexadecimal=False)  
'manuel-v-x00e1-zquez-acosta'

Deprecated since version 2.1.0: Use xotl.tools.strings.slugify().

Package xoutil

Transition to a new namespace

Since version 2.1, we’re transitioning to another name: xotl.tools. This is to align xoutil as a fundamental part of our family of projects under the xotl namespace. Xotl is a Nahuatl word which may stand for ‘foundation’. xoutil is part of the foundation of many of our projects.

Backwards compatible imports

Since 2.1, every module importable from xoutil is actually under the namespace xotl.tools; so importing, for instance, from xoutil.future.datetime should be updated to xotl.tools.future.datetime.

Importing from xoutil will still be possible in all versions before 3.0. You won’t have to change all your imports right away.
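
For example, both of the following imports currently resolve to the same code; only the second form is future-proof (TimeSpan is used here merely as an illustration):

from xoutil.future.datetime import TimeSpan      # deprecated namespace, works until 3.0
from xotl.tools.future.datetime import TimeSpan  # preferred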

Distribution of xoutil

We will continue to distribute both xotl.tools and xoutil (with the same codebase) for the entire 2.1.x series. From version 2.2.0 onwards we will distribute only xotl.tools, but keep the backwards-compatible imports up to 3.0.

Warning

Don’t depend on both xoutil and xotl.tools. We use the same codebase for both distributions; which means you’ll get the same code, but if you install different versions you may get a crippled system.

Changelog

2.1 series

Unreleased. Release 2.1.10
  • Improve type hints for several modules.

    We run mypy 0.782 in a large project of ours that uses many modules of xotl.tools and we discovered no major roadblocks. So we think this deserves its own release.

    The list of modules we deem complete:

  • Add official support for Python 3.9. We now run our test suite with Python 3.9.

2020-07-05. Release 2.1.9
  • Add official support for Python 3.8 and drop support for Python 3.5.

    Even though our packages have been tested with Python 3.8 for a while, this release marks the official support.

    Dropping support for Python 3.5 means we are no longer going to test our changes in Python 3.5.

  • Backport module graphlib from Python 3.9 in xotl.tools.future.graphlib. Refer to the standard documentation.

  • Add function xotl.tools.future.itertools.zip_map().

2020-03-31. Release 2.1.8
  • Make xotl.tools.infinity.Infinity comparable with unknown types by returning NotImplemented instead of raising the error directly. See MR !28 for an example.
2020-03-10. Release 2.1.7
2020-01-21. Release 2.1.6
  • Deprecate xotl.tools.future.itertools.first_n() in favor of stdlib’s itertools.islice().
2019-12-12. Release 2.1.5
  • Deprecate iter_final_subclasses() and get_final_subclasses().
  • Stop support for Python 3.4.
  • Fix bug #6: Instances of boolean were not picklable.
2019-10-26. Release 2.1.4
2019-05-26. Release 2.1.3
2019-05-26. Release 2.1.2
2019-03-13. Release 2.1.1
  • Fix packaging issue. No functional changes.
2019-02-27. Release 2.1.0
  • Repackage xoutil under xotl.tools. You can still import from the xoutil namespace.
  • Remove deprecated module xoutil.logger.
  • Remove package xoutil.eight.
  • Remove deprecated xoutil.decorator.memoized_property, use xotl.tools.objects.memoized_property.
  • Remove deprecated functions and classes xoutil.future.inspect.type_name, xoutil.future.functools.ctuple, and xoutil.future.functools.compose.
  • Remove deprecated top-level imports: xoutil.Unset, xoutil.Undefined, xoutil.Ignored and xoutil.Invalid.
  • Add xotl.tools.deprecation.deprecated_alias().
  • Allow to customize Quantity in xotl.tools.dim.meta.Dimension and, by argument, in new().
  • Deprecate xoutil.future.itertools.zip(), and xoutil.future.itertools.map().
  • Re-implement xotl.tools.future.itertools.merge() in terms of heapq.merge().
  • Add xotl.tools.tasking.get_backoff_wait()
  • Add xotl.tools.objects.iter_final_subclasses(), xotl.tools.objects.get_final_subclasses() and xotl.tools.objects.FinalSubclassEnumeration().
  • Deprecate module xotl.tools.progress.
  • Deprecate module xotl.tools.values.ids.
  • Deprecate xotl.tools.web.slugify; use xotl.tools.strings.slugify() instead.
  • Remove deprecated module xotl.tools.uuid.
  • Remove deprecated module xotl.tools.logical.
  • Remove deprecated module xotl.tools.formatter.
  • Remove deprecated function xotl.tools.tools.get_default.

2.0 series

Note

End-of-life for xoutil 2.0

xoutil 2.0.7 will be the last release in the xoutil 2.0.x series that adds new functionality. Any future release in this series will be bug-fix only.

Since the pair-wise releases of 1.9.x and 2.0.x some new functionality has been added to some version of 1.9.x that is not present in some releases of the 2.0.x series.

This created some unease for users wanting a new feature of 1.9.3 in a package where Python 2/3 compatibility was not a true concern; they were forced to require ‘xoutil>=1.9.3,!=2.0.0,!=2.0.1,!=2.0.2’ to prevent the package manager from selecting a version without the needed feature.

This end-of-life notice puts an end to this issue.

2018-11-07. Release 2.0.9
2018-09-24. Release 2.0.8
  • Incorporates all (applicable) changes from release 1.9.8
  • Fix bug when comparing version numbers (xoutil.versions).
2018-09-14. Release 2.0.7

Incorporates all (applicable) changes from release 1.9.7

2018-07-30. Release 2.0.6

Incorporates all (applicable) changes from release 1.9.6:

2018-06-25. Release 2.0.5

Incorporates all (applicable) changes from release 1.9.5:

2018-05-09. Release 2.0.4.1
  • Packaging fix: the python tag for releases in the 2.0 branch was incorrectly set to “py2”. xoutil 2.0+ supports only Python 3.4+.

    We’re removing the wrongly tagged releases from PyPI.

2018-05-08. Release 2.0.4

Incorporates all (applicable) changes from release 1.9.4:

2018-04-16. Release 2.0.3
2018-03-30. Release 2.0.2
2018-03-22. Release 2.0.1
2018-03-02. Release 2.0.0
  • This is the first release in which Python 2 is no longer supported. It was a good time! Goodbye, Python 2!

  • The following imports are no longer available. Look for them in xoutil.future:

    • xoutil.collection
    • xoutil.datetime
    • xoutil.functools
    • xoutil.inspect
    • xoutil.iterators
    • xoutil.json
    • xoutil.pprint
    • xoutil.subprocess
    • xoutil.textwrap
    • xoutil.threading
    • xoutil.types
  • Deprecate modules that only provided a unifying mechanism between Python 2 and 3, or that backported features from Python 3 to Python 2:

    • xoutil.annotate
    • xoutil.eight
    • xoutil.eight.urllib
  • Remove deprecated module xoutil.html.

1.9 series

Note

End-of-life for xoutil 1.9

xoutil 1.9.7 will be the last release of xoutil that adds functionality. Future releases will be strictly bug-fix only.

2018-11-07. Release 1.9.9
  • Fix bug #4: xoutil.decorator.meta.flat_decorator() was not working in Python 3.
  • Deprecate xoutil.decorator.meta.flat_decorator().
2018-09-24. Release 1.9.8
  • Fix bug in xoutil.cli for Python 3.7.
2018-09-14. Release 1.9.7
  • Add support for Python 3.7.
  • xoutil.eight.abc.ABC is an alias to the stdlib’s ABC class if using Python 3.4+.
  • Rename xoutil.fp.iterators.iter_compose to xoutil.fp.iterators.kleisli_compose(). Leave iter_compose as deprecated alias.
  • Add xoutil.future.datetime.TimeSpan.diff()
  • Add xoutil.future.datetime.DateTimeSpan.
2018-07-30. Release 1.9.6
  • Add parameter ‘encoding’ to xoutil.eight.string.force() and xoutil.eight.string.safe_join().
  • Add xoutil.fp.iterators and xoutil.fp.iterators.iter_compose().
2018-06-25. Release 1.9.5
  • Add module xoutil.future.contextlib.
2018-05-08. Release 1.9.4
  • Fix xoutil.eight.iteritems(), xoutil.eight.itervalues() and xoutil.eight.iterkeys() to return an iterator.
  • Change is_valid_identifier() so that it uses str.isidentifier() in Python 3.
  • Add class method xoutil.future.collections.opendict.from_enum()
2018-04-16. Release 1.9.3
  • Make TimeSpan intersection inversible. Before, doing date.today() & TimeSpan() raised a TypeError, but swapping the operands worked. Now, both ways work.
  • Add xoutil.objects.delegator() and xoutil.objects.DelegatedAttribute.
2018-03-30. Release 1.9.2
  • xoutil.context.NullContext is now a Mapping.
2018-03-22. Release 1.9.1
  • Fix bug #29: Issues with xoutil.symbols.symbol documentation and implementation.
  • Fix bug #30: It was possible to define a dimension with two (or more) incompatible canonical units.
  • Fix bug #33: Reusing a context leaves the context unusable.
  • Renamed xoutil.tasking.StandardWait to xoutil.tasking.ConstantWait.
2018-03-02. Release 1.9.0
  • With the release of 2.0.0, xoutil ends its support for Python 2.

    Releases 1.9 are a continuation of the 1.8 series and don’t break any API found in the last release of that series: 1.8.8.

  • Add xoutil.objects.import_object().

  • Add xoutil.context.Context.from_defaults() and xoutil.context.Context.from_dicts().

  • Deprecate imports from top-level xoutil. The following objects should be imported from xoutil.symbols:

    • Unset
    • Undefined
    • Invalid
    • Ignored

1.8 series

2018-02-24. Release 1.8.8
  • Fix bug #28: xoutil.future.inspect.getattr_static() failed with Python 2’s old-style classes.
2018-01-06. Release 1.8.7
  • Add parameter ‘encoding’ to slugify() and force_ascii(). (bug #25).
  • Stop using locale.getpreferredencoding() in force_encoding(). Also related to bug #25.
2018-01-02. Release 1.8.6
2017-12-22. Release 1.8.5
  • Deprecate module xoutil.logger.
  • Remove deprecated function xoutil.iterators.fake_dict_iteritems.
  • Add xoutil.objects.temp_attributes().
  • Add functions fst() and snd().
2017-12-15. Release 1.8.4
  • Add module xoutil.future.csv.
  • Add module xoutil.future.mimetypes.
  • Add module xoutil.eight.urllib.
  • The module xoutil.iterators is now officially named xoutil.future.itertools. The alias xoutil.iterators remains as a deprecated alias.
  • Add xoutil.future.itertools.merge().
  • Add xoutil.future.types._get_mro_attr() function.
  • Deprecate in xoutil.future.types module: mro_dict class; and the functions mro_get_value_list, mro_get_full_mapping, is_iterable, is_collection, is_mapping, is_string_like, is_scalar, is_module, are_instances, and no_instances.
2017-11-28. Release 1.8.3
  • Fix bug #20: xoutil.future.calendar may fail at import time.
  • Add xoutil.params.pop_keyword_values().
  • Add xoutil.future.collections.codedict. Deprecate module xoutil.formatter.
2017-11-22. Release 1.8.2
  • Add displacement operations left shift (<<) and right shift (>>) for TimeSpan.
  • Add xoutil.objects.smart_getter() and xoutil.objects.save_attributes().
  • Document experimental module xoutil.tasking.
  • Add extra (and experimental) module xoutil.testing.
2017-11-17. Release 1.8.1
  • Remove deprecated xoutil.objects.get_and_del_first_of(), xoutil.objects.smart_getattr(), and xoutil.objects.get_and_del_attr().
  • Remove deprecated arguments from xoutil.objects.xdir() and xoutil.objects.fdir().
  • Fix bug #17: xoutil.fp.tools.compose is not wrappable.
  • Move xoutil.decorator.memoized_property to xoutil.objects.memoized_property module. Deprecate the first.
  • Deprecate xoutil.decorator.memoized_instancemethod.
  • Deprecate xoutil.decorator.reset_memoized(). Use reset().
  • Fix bug (unregistered): xoutil.objects.traverse() ignores its getter.
2017-11-03. Release 1.8.0
  • Remove deprecated xoutil.objects.metaclass, use xoutil.eight.meta.metaclass() instead.

  • Several modules are migrated to xoutil.future:

    • types.
    • collections.
    • datetime.
    • functools.
    • inspect.
    • codecs.
    • json.
    • threading.
    • subprocess.
    • pprint.
    • textwrap.

    Note

    All modules remain importable from their future-less versions; those imports are, however, deprecated.

  • Add function xoutil.deprecation.import_deprecated(), inject_deprecated() can be deprecated now.

  • Add function xoutil.deprecation.deprecate_linked() to deprecate full modules imported from a linked version. The main example are all sub-modules of xoutil.future.

  • Add function xoutil.deprecation.deprecate_module() to deprecate full modules when imported.

  • The module xoutil.string underwent a major reorganization due to the ambiguous use of strings in Python.

  • Create __crop__ protocol for small string representations, see xoutil.clipping.crop() for more information.

    Because the clipping module is still experimental, the definitive names of the operator and the main function must be validated before they can be considered final. Proposals are: “crop”, “small”, “short”, “compact”, “abbr”.

  • Remove xoutil.connote that was introduced provisionally in 1.7.1.

  • Module xoutil.params was introduced provisionally in 1.7.1, but now has been fully recovered.

    • Add function issue_9137() – Helper to fix issue 9137 (self ambiguity).
    • Add function check_count() – Checks the actual count of positional arguments against constraints.
    • Add function check_default() – Default value getter when passed as a last excess positional argument.
    • Add function single() – Return true only when a unique argument is given.
    • Add function xoutil.params.keywords_only – Decorator to make a function accept its keyword arguments as keyword-only.
    • Add function pop_keyword_arg() – Tool to get a value from keyword arguments using several possible names.
    • Add class ParamManager – Parameter manager in a “smart” way.
    • Add class ParamScheme – Parameter scheme definition for a manager.
    • Add class ParamSchemeRow – Parameter scheme complement.
    • Remove xoutil.params.ParamConformer.
  • Module xoutil.values was recovered adding several new features (old name xoutil.cl was deprecated).

  • Add experimental module xoutil.fp for Functional Programming stuffs.

  • Add experimental module xoutil.tasking.

  • Add xoutil.symbols. It replaces xoutil.logical that was introduced in 1.7.0, but never documented.

  • Remove deprecated module xoutil.data. Add xoutil.objects.adapt_exception().

  • Remove deprecated xoutil.dim.meta.Signature.isunit().

1.7 series

2017-10-31. Release 1.7.12
  • xoutil.datetime.EmptyTimeSpan is now pickable.
2017-10-05. 1.7.11
  • Fix bug #9: TimeSpans are not hashable.
2017-09-21. 1.7.10
  • Fix bug #6: TimeSpan.overlaps was incorrectly defined.
  • Fix bug #5: TimeSpan can’t have a union method.
2017-09-20. 1.7.9
  • Deprecate xoutil.dim.meta.Signature.isunit().
  • Rename xoutil.dim.meta.QuantityType to xoutil.dim.meta.Dimension.
  • Fix bug in xoutil.datetime.TimeSpan. start_date and end_date now return an instance of Python’s datetime.date instead of a sub-class.
2017-09-19. 1.7.8
  • Added module xoutil.dim – Facilities to work with concrete numbers.
2017-09-07. 1.7.7
  • Fixed bug in xoutil.datetime.date that prevented to use strftime() in subclasses.
  • Fixed bug in xoutil.datetime.TimeSpan.valid().
2017-09-05. Release 1.7.6
  • Fix a bug in xoutil.datetime.TimeSpan for Python 2. Representing a time span might fail with a ‘Maximum Recursion Detected’ error.
2017-09-05. Release 1.7.5
  • Added xoutil.datetime.TimeSpan.
  • Added the module xoutil.infinity.
  • Added the keyword argument on_error to xoutil.bound.until_errors().
2017-04-06. Release 1.7.4
  • Added the argument key to xoutil.iterators.delete_duplicates().
  • Added the function xoutil.iterators.iter_delete_duplicates().
2017-02-23. Release 1.7.3
  • Add xoutil.iterators.ungroup().
  • Add xoutil.future.datetime.get_next_month().
2017-02-07. Release 1.7.2
  • Add xoutil.bound.until() and xoutil.bound.until_errors().
  • Fix issue that made xoutil.uuid unusable. Introduced in version 1.7.1, commit 58eb359.
  • Remove support for Python 3.1 and Python 3.2.
2015-12-17. Release 1.7.1
  • Add xoutil.collections.PascalSet and xoutil.collections.BitPascalSet.
  • Add xoutil.functools.lwraps().
  • Add xoutil.objects.multi_getter(), xoutil.objects.get_branch_subclasses(), xoutil.objects.fix_method_documentation().
  • Add xoutil.string.safe_str()
  • Remove long deprecated modules: xoutil.aop and xoutil.proxy.
  • Deprecate xoutil.html entirely.
  • The following modules are included on a provisional basis. Backwards incompatible changes (up to and including removal of the module) may occur if deemed necessary by the core developers:
    • xoutil.connote.
    • xoutil.params.

Fixes in 1.7.1.post1:

  • Fix issue with both xoutil.string.safe_decode() and xoutil.string.safe_encode().

    Previously, the parameter encoding could contain an invalid encoding name and the function could fail.

Fixes in 1.7.1.post2:

  • Fix xoutil.string.cut_suffix(). The following invariant was being violated:

    >>> cut_suffix(v, '') == v  # for any value of 'v'
    

Warning

Due to lack of time, we have decided to release this version without proper releases of 1.7.0 and 1.6.11.

Unreleased. Release 1.7.0

This release was mainly focused on providing a new starting point for several other changes. It is being synchronized with the last release of the 1.6 series (1.6.11) to allow deprecation messages to be included properly.

The following is the list of changes:

  • The defaults argument of xoutil.objects.smart_copy() has been made keyword-only.
  • Deprecates the pop() semantics since they shadow dict.pop(). A new pop_level() is provided to explicitly pop a stack level; the same is done for the pop() method.
  • Deprecates function xoutil.iterators.fake_dict_iteritems.
  • Deprecates xoutil.objects.metaclass in favor of xoutil.eight.meta.metaclass().

1.6 series

Unreleased. Release 1.6.11

This is the last release of the 1.6 series. It’s being synchronized with release 1.7.0 to deprecate here what’s being changed there.

  • The defaults argument of xoutil.objects.smart_copy() is marked to be keyword-only in version 1.7.0.
  • Fixes a bug in xoutil.objects.smart_copy(). If defaults was None, it was not being treated the same as False, as documented. This bug was also fixed in version 1.7.0.
  • xoutil.objects.metaclass() will be moved to xoutil.eight.meta in version 1.7.0 and deprecated; it will be removed from xoutil.objects in version 1.7.1.
  • This release will be the last to support Python 3.1, 3.2 and 3.3. Support will be kept for Python 2.7 and Python 3.4.
2015-04-15. Release 1.6.10
  • Fix repr() and str() issues with xoutil.cli.Command instances.
2015-04-03. Release 1.6.9
  • The defaults argument in xoutil.objects.smart_copy() is now keyword-only.
  • xoutil.context is now greenlet-safe without depending on gevent.
2015-01-26. Release 1.6.8
  • Added xoutil.records.date_reader().
  • Added a forward-compatible xoutil.inspect.getfullargspec.
  • Now contexts will support gevent-locals if available. See the note in the module documentation.
  • Minor fixes.
2014-12-17. Release 1.6.7
  • Added the strict argument to xoutil.records.datetime_reader().

  • You may now install xoutil[extra] so that not required but useful packages are installed when xoutil is installed.

    For now this only includes python-dateutil that allows the change in datetime_reader().

2014-11-26. Release 1.6.6
  • Improved the xoutil.string.normalize_slug() by providing both valid and invalid chars.
  • Added the xoutil.string.normalize_ascii().
2014-10-13. Release 1.6.5
  • Added the module xoutil.records.

  • Deleted deprecated xoutil.compat.

  • Deprecate xoutil.six. It will be removed in 1.7.0 (probably the next release).

    Now xoutil requires six 1.8.0.

2014-09-13. Release 1.6.4
  • Fix bug in xoutil.fs.concatfiles(): There were leaked opened files.
2014-08-05. Release 1.6.3
  • Added the pre-release version of xoutil.bound module.
2014-08-04. Release 1.6.2
  • Fix encoding issues in xoutil.string.cut_prefix() and xoutil.string.cut_suffix().

    Previously this code failed:

    >>> from xoutil.string import cut_prefix
    >>> cut_prefix(u'-\xe1', '-')
    Traceback ...
      ...
    UnicodeEncodeError: 'ascii' ...
    

    Now both functions force their second argument to be of the same type as the first. See xoutil.string.safe_decode() and xoutil.string.safe_encode().

2014-07-18. Release 1.6.1
  • Added the yield parameter in xoutil.fs.ensure_filename().
  • Added the base parameter in xoutil.modules.moduleproperty().
  • Added the function xoutil.fs.concatfiles().
2014-06-02. Release 1.6.0
  • Changes the signature of xoutil.names.nameof(); the semantics of the full parameter are also improved.

    This is the major change in this release. Actually, this release has been prepared in sync with release 1.5.6 (just a few days ago) to let this change land while still keeping our versioning scheme.

1.5 series

2014-05-29. Release 1.5.6
  • Warn about a future backwards incompatible change in the behavior of xoutil.names.nameof().
2014-05-13. Release 1.5.5
  • UserList is now a collection in the sense of xoutil.types.is_collection().

  • Python 3.4 added to the list of tested Python environments. Notice this does not make any warranties about identical behavior of things that were previously backported from Python 3.3.

    For instance, the xoutil.collections.ChainMap has been already backported from Python 3.4, so it will have the same signature and behavior across all supported Python versions.

    But other new things in Python 3.4 are not yet backported to xoutil.

  • Now xoutil.objects.metaclass() supports the __prepare__ classmethod of metaclasses. This is fully supported in Python 3.0+ and partially mocked in Python 2.7.

  • Backported xoutil.types.MappingProxyType from Python 3.3.

  • Backported xoutil.types.SimpleNamespace from Python 3.4.

  • Backported xoutil.types.DynamicClassAttribute from Python 3.4

  • Added function xoutil.iterators.delete_duplicates().

  • Added parameter ignore_underscore to xoutil.string.normalize_slug().

  • Added module xoutil.crypto with a function for generating passwords.

  • Fixed several bugs in xoutil.functools.compose().

  • Makes xoutil.fs.path.rtrim() have a default value for the number of steps to traverse.

2014-04-08. Release 1.5.4
  • Fix a bug in xoutil.objects.extract_attrs(). It was not raising exceptions when some attribute was not found and default was not provided.

    Also now the function supports paths like xoutil.objects.get_traverser().

  • xoutil now contains a copy of the excellent project six, exported as xoutil.six (not documented here). Thus the compatibility module xoutil.compat is now deprecated and will be removed in the future.

    There are some things that xoutil.compat has that xoutil.six does not. For instance, six does not include fine-grained Python version markers. So if your code depends not on the Python 2 vs Python 3 dichotomy but on features introduced in Python 3.2, you must use sys.version_info directly.

    Notwithstanding that, xoutil will slowly backport several Python 3.3 standard library features to Python 2.7 so that they can be used consistently in any supported Python from 2.7 onwards (except 3.0).

2014-04-01. Release 1.5.3
  • Now xoutil supports Python 2.7, and 3.1+. Python 3.0 was not tested.

  • Added a strict parameter to xoutil.objects.smart_getter().

  • New function xoutil.objects.get_traverser().

  • The function xoutil.cli.app.main() prefers its default parameter instead of the application’s default command.

    Allow the xoutil.cli.Command to define a command_cli_name to change the name of the command. See xoutil.cli.tools.command_name().

2014-03-03. Release 1.5.2
  • Deprecated function xoutil.objects.get_and_del_key(). Use the dict.pop() directly.

    To have consistent naming, renamed get_and_del_attr() and get_and_del_first_of() to popattr() and pop_first_of(). Old names are left as deprecated aliases.

  • Now xoutil.functools.update_wrapper(), xoutil.functools.wraps() and xoutil.functools.lru_cache() are Python 3.3 backports (or aliases).

  • New module xoutil.textwrap.

2014-02-14. Release 1.5.1
  • Added functions xoutil.objects.dict_merge(), xoutil.types.are_instances() and xoutil.types.no_instances().
  • Deprecated function xoutil.objects.smart_getattr(). Use xoutil.objects.get_first_of() instead.
2014-01-24. Release 1.5.0
  • Lots of removals. Practically all deprecated since 1.4.0 (or before). Let’s list a few but not all:
    • Both xoutil.Unset and xoutil.Ignored are no longer re-exported in xoutil.types.
    • Removes module xoutil.decorator.compat, since it only contained the deprecated decorator xoutil.decorator.compat.metaclass() in favor of xoutil.objects.metaclass().
    • Removes nameof and full_nameof from xoutil.objects in favor of xoutil.names.nameof().
    • Removes pow_ alias of xoutil.functools.power().
    • Removes the deprecated xoutil.decorator.decorator function. Use xoutil.decorator.meta.decorator() instead.
    • Now get_module_path() is documented and in module xoutil.modules.
  • Also we have documented a few more functions, including xoutil.fs.path.rtrim().
  • All modules below xoutil.aop are at risk and are being deprecated.

1.4 series

  • Adds xoutil.datetime.daterange().

  • Adds xoutil.objects.traverse().

  • Adds xoutil.fs.makedirs() and xoutil.fs.ensure_filename().

  • The fill argument in function xoutil.iterators.slides() now defaults to None. This is consistent with the intended usage of Unset and with the semantics of both xoutil.iterators.continuously_slides() and xoutil.iterators.first_n().

    Unset, as a default value for parameters, is meant to signify the absence of an argument and thus only would be valid if an absent argument had some kind of effect different from passing the argument.

  • Changes xoutil.modules.customize() API to separate options from custom attributes.

  • Includes a random parameter to xoutil.uuid.uuid().

  • Deprecations and introductions:
    • Importing xoutil.Unset and xoutil.Ignored from xoutil.types now issues a warning.
    • New style for declaring portable metaclasses in xoutil.objects.metaclass(), so xoutil.decorator.compat.metaclass() is now deprecated.
    • Adds the module xoutil.pprint and function xoutil.pprint.ppformat().
    • Adds the first version of package xoutil.cli.
    • Adds the filter parameter to functions xoutil.objects.xdir() and xoutil.objects.fdir() and deprecates attr_filter and value_filter.
    • Adds functions xoutil.objects.attrclass(), xoutil.objects.fulldir().
    • Adds function xoutil.iterators.continuously_slides().
    • Adds package xoutil.threading.
    • Adds package xoutil.html module and begins the port of xoutil.html.parser from Python 3.3 to xoutil, so that a common implementation for both Python 2.7 and Python 3.3 is available.
  • Bug fixes:
    • Fixes some errors with classical AOP weaving of functions in modules that were customized.
    • Fixes bugs with xoutil.modules: makes xoutil.modules.modulemethod() to customize the module, and improves performance.
2013-04-26. Release 1.4.0
  • Refactors xoutil.types as explained in types-140-refactor.
  • Changes involving xoutil.collections:
    • Moves SmartDict and SortedSmartDict from xoutil.data to xoutil.collections. They are still accessible from xoutil.data.
    • Also there is now a xoutil.collections.SmartDictMixin that implements the update behind all smart dicts in xoutil.
    • xoutil.collections.StackedDict is now a SmartDict and thus gains zero-level initialization data.
  • Removals of deprecated, poorly tested, or incomplete features:
    • Removes deprecated xoutil.decorators. Use xoutil.decorator.
    • Removed xoutil.iterators.first(), and xoutil.iterators.get_first().
    • Removed xoutil.string.names(), xoutil.string.normalize_to_str() and xoutil.string.normalize_str_collection().
  • Newly deprecated functions:
    • Deprecates xoutil.iterators.obtain().
    • Deprecates xoutil.iterators.smart_dict() and xoutil.data.smart_copy in favor of xoutil.objects.smart_copy().
  • New features:
    • Introduces xoutil.iterators.first_non_null().
    • Adds xoutil.objects.copy_class() and updates xoutil.decorator.compat.metaclass() to use it.
  • Fixes a bug with xoutil.deprecation.deprecated() when used with classes: It changed the hierarchy and provoked infinite recursion in methods that use super.

1.3 series

  • Removes deprecated module xoutil.mdeco.

  • xoutil.context.Context now inherits from the newly created stacked dict class xoutil.collections.StackedDict. Whenever you enter a context, a new level of the stacked dict is pushed; when you leave the context, a level is popped (xoutil.collections.StackedDict.pop()).

    This also removes the data attribute execution context used to have, and, therefore, this is an incompatible change.

  • Introduces xoutil.collections.OpenDictMixin and xoutil.collections.StackedDict.

  • Fixes a bug in xoutil.decorator.compat.metaclass(): Slots were not properly handled.

  • Fixes a bug with the simple xoutil.collections.opendict that allowed methods (even __getitem__) to be shadowed, thus making the dict unusable.

1.2 series

2013-04-03. Release 1.2.3
  • Bug fixes in xoutil.proxy and xoutil.aop.classical.
2013-03-25. Release 1.2.2
  • Adds xoutil.bases - Implementations of base 32 and base 64 (numeric) representations.
2013-02-14. Release 1.2.1
  • Loads of improvements for Python 3k compatibility: Several modules were fixed or adapted to work on both Python 2.7 and Python 3.2. They include (but we might have forgotten some):
    • xoutil.context.
    • xoutil.aop.basic.
    • xoutil.deprecation.
    • xoutil.proxy.
  • Rescued xoutil.annotate; it is going to be supported from now on.
  • Introduced module xoutil.subprocess and function xoutil.subprocess.call_and_check_output().
  • Introduced module xoutil.decorator.compat that enables constructions that are interoperable in Python 2 and Python 3.
  • Introduced xoutil.iterators.zip(), xoutil.iterators.izip(), xoutil.iterators.map(), and xoutil.iterators.imap().
2013-01-04. Release 1.2.0

This is the first of the 1.2.0 series. It’s been given a bump in the minor version number because we’ve removed some deprecated functions and/or modules.

  • Several enhancements to xoutil.string to make it work on Python 2.7 and Python 3.2.

    Deprecates xoutil.string.normalize_to_str() in favor of the newly created xoutil.string.force_str() which is Python 3 friendly.

  • Backwards incompatible changes in xoutil.objects API. For instance, replaces getattr parameter with getter in xoutil.objects.xdir() and co.

  • Extracts decorator-making facilities from xoutil.decorators into xoutil.mdeco.

  • Fixes in xoutil.aop.extended. Added parameters in xoutil.aop.classical.weave().

  • Introduces xoutil.iterators.first_n() and deprecates xoutil.iterators.first() and xoutil.iterators.get_first().

  • Removes the zope.interface awareness from xoutil.context since it contained a very hard to catch bug. Furthermore, this was included to help the implementation of xotl.ql, and it’s no longer used there.

    This breaks version control policy since it was not deprecated beforehand, but we feel it’s needed to avoid spreading this bug.

  • Removed long-standing deprecated modules xoutil.default_dict, xoutil.memoize and xoutil.opendict.

  • Fixes bug in xoutil.datetime.strfdelta(). It used to show things like ‘1h 62min’.

  • Introduces xoutil.compat.class_type that holds class types for Python 2 or Python 3.

1.1 series

2012-11-01. Release 1.1.4
  • Introduces xoutil.compat.iteritems_(), xoutil.compat.iterkeys_() and xoutil.compat.itervalues_().
  • Execution contexts are now aware of zope.interface interfaces, so you may ask for a context name implementing a given interface instead of the name itself.
  • Improves xoutil.formatter documentation.
  • Several fixes to xoutil.aop.classical. It has sudden backwards incompatible changes.
  • before and after methods may use the *args, **kwargs idiom to get the passed arguments of the weaved method.
  • Several minor fixes: Invalid warning about Unset not in xoutil.types
2012-08-22. Release 1.1.3
  • Adds function xoutil.fs.rmdirs() that removes empty dirs.
  • Adds functions xoutil.string.safe_join(), xoutil.string.safe_encode(), xoutil.string.safe_decode(), and xoutil.string.safe_strip(); and the class xoutil.string.SafeFormatter.
  • Adds function xoutil.cpystack.iter_frames().
2012-07-11. Release 1.1.2
  • Fixes all copyrights notices and chooses the PSF License for Python 3.2.3 as the license model for xoutil releases.
  • All releases from now on will be publicly available at github.
2012-07-06. Release 1.1.1
  • Improves deprecation warnings by pointing to the real calling filename
  • Removes all internal use of simple_memoize since it’s deprecated. We now use lru_cache().
2012-07-03. Release 1.1.0
  • Created the whole documentation Sphinx directory.
  • Removed xoutil.future since it was not properly tested.
  • Removed xoutil.annotate, since it’s not portable across Python’s VMs.
  • Introduced module xoutil.collections
  • Deprecated modules xoutil.default_dict, xoutil.opendict in favor of xoutil.collections.
  • Backported xoutil.functools.lru_cache() from Python 3.2.
  • Deprecated module xoutil.memoize in favor of xoutil.functools.lru_cache().

1.0 series

2012-06-15. Release 1.0.30
  • Introduces a new module xoutil.proxy.
  • Starts working on the Sphinx documentation so that we reach the 1.1 release with decent documentation.
2012-06-01. Release 1.0.29.
  • Introduces xoutil.iterators.slides and xoutil.aop.basic.contextualized
2012-05-28. Release 1.0.28.
  • Fixes path normalization and other details
  • Makes validate_attrs work with mappings as well as objects
  • Improves complementors to use classes as a special case of sources
  • Simplifies importing of legacy modules
  • PEP8
2012-05-22. Release 1.0.27.
  • Removes bugs that were not checked (tested) in the previous release.
2012-05-21. Release 1.0.26.
  • Changes in AOP classic. Now you have to rename after, before and around methods to _after, _before and _around.

    It is expected that the signature of those methods change in the future.

  • Introducing a default argument for xoutil.objects.get_first_of().

  • Other minor additions in the code. Refactoring and the like.

2012-04-30. Release 1.0.25.
  • Extends the classical AOP approach to modules. Implements an extended version with hooks.
  • 1.0.25.1: Makes classical/extended AOP more resilient to TypeErrors in getattr. xoonko may raise TypeErrors for TranslatableFields.

2012-04-27. Release 1.0.24.

  • Introduces a classical AOP implementation: xoutil.aop.classical.
2012-04-10. Release 1.0.23.
  • Introduces decorators: xoutil.decorators.instantiate and xoutil.aop.complementor
2012-04-05. Release 1.0.22
  • Allows annotations’ expressions to use locally defined variables. Before this release, the following code raised an error:

    >>> from xoutil.annotate import annotate
    >>> x1 = 1
    >>> @annotate('(a: x1)')
    ... def dummy():
    ...     pass
    Traceback (most recent call last):
       ...
    NameError: global name 'x1' is not defined
    
  • Fixes decorators to allow args-less decorators

2012-04-03. Release 1.0.21
  • Includes a new module xoutil.annotate that provides a way to place Python annotations in a forward-compatible way.

How to contribute to xotl.tools

Testing

Running tests

Quick:

pipenv install --dev
tox
Writing tests

Testing was not introduced in xotl.tools until late in the project life. So there are many modules that lack a proper test suite.

To ease the task of writing tests, we chose pytest.

We use both normal tests (“à la pytest”) and doctest. The purpose of doctests is testing the documentation instead of testing the code, which is the purpose of the former.

Most of our normal tests are currently simple functions with the “test” prefix and are located in the tests/ directory.

Many functions that lack tests are, though, exercised by our use in other projects. However, it won’t hurt if we write tests for them.
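
For illustration, a hypothetical test file could look like this (the file name is made up; the expected value mirrors the int_coerce doctests in the values module above):

# tests/test_values_example.py -- hypothetical example
from xotl.tools.values import int_coerce

def test_int_coerce_accepts_numeric_strings():
    # '3.0' coerces to the integer 3, as in the pargs examples above.
    assert int_coerce('3.0') == 3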

Documentation

Since xotl.tools is a collection of very disparate stuff, the documentation is hardly narrative; it is contained in the docstrings of every “exported” element, except perhaps for module-level documentation in some cases. In these latter cases, a more narrative text is placed in the .rst file that documents the module.

Versioning and deprecation

xoutil uses three version components.

The first number refers to language compatibility: the xoutil 1.x series is devoted to keeping compatible versions of the code for both Python 2.7 and Python 3.2+. The jump to the 2.x series will be made when xoutil no longer supports Python 2.7.

Since version 2.1.0, the package has been renamed to xotl.tools, but we keep the xoutil imports until version 3.0 and the distribution of xoutil until version 2.2.0.

The second number is the library’s major version indicator. A bump here indicates that some deprecated stuff is finally removed and/or new functionality is provided.

The third number is the minor release number, devoted mostly to fixes of existing functionality. Many times, though, some functions are merged and the old ones get a deprecation warning.

Occasionally, a fourth component is added to a release. This usually means a packaging problem, or bug in the documentation.

Module layout and rules

Many modules in xotl.tools contain definitions used by xotl.tools itself. Though we try to place every feature into a rightful, logical module, sometimes this is not possible because it would lead to import dependency cycles.

We are establishing several rules to keep our module layout and dependency quite stable while, at the same time, allowing developers to use almost every feature in xoutil.

We divide xoutil modules into 4 tiers:

  1. Tier 0

    This tier groups the modules that must not depend on anything besides the standard library. These modules implement features that are exported through other xoutil modules. These modules are never documented, but their re-exported features are documented elsewhere.

  2. Tier 1

    In this tier we have:

    • xotl.tools.decorator.meta. This is to allow the definition of decorators in other modules.
    • xotl.tools.names. This is to allow the use of xotl.tools.names.namelist for the __all__ attribute of other modules.
    • xotl.tools.deprecation. It must not depend on any other module. Many modules in xotl.tools will use this module at import time to declare deprecated features.
  3. Tier 2

    Modules in this tier should depend only on features defined in tier 0 and tier 1 modules, and they export features that can be imported at the module level.

    This tier only has xotl.tools.modules. Both xotl.tools.modules.moduleproperty() and xotl.tools.modules.modulemethod() are meant to be used in module-level definitions, so they are likely to be imported at the module level.

  4. Tier 3

    The rest of the modules.

    In this tier, xotl.tools.objects is king. But in order to allow the import of other modules the following pair of rules are placed:

  • At the module level only import from upper tiers.
  • Imports from tier 3 are allowed, but only inside the functions that use them.

This entails that a tier-3 module can’t provide a function that must be imported at module level by other tier-3 modules, like a decorator for other functions. For that reason, decorators are mostly placed in the xotl.tools.decorator module.
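
A minimal sketch of the deferred-import rule (the helper below is hypothetical; get_first_of is used only as an illustration of a tier-3 import, and the tier assignment of xotl.tools.symbols is assumed):

# hypothetical tier-3 module following the layout rules
from xotl.tools.symbols import Unset      # assumed upper-tier import: fine at module level

def display_name(obj, default='<unnamed>'):
    # xotl.tools.objects is also a tier-3 module, so it is imported inside
    # the function that uses it (not at module level) to avoid import cycles.
    from xotl.tools.objects import get_first_of
    name = get_first_of(obj, 'display_name', 'name', default=Unset)
    return default if name is Unset else name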

The tiers above are a “logical suggestion” of how xoutil modules are organized and an indication of how they might evolve.

[1] See the definitive list of required Python interpreters in the tox.ini file.

List of contributors

If you’re a contributor and you’re not listed here, we apologize for that omission, and ask you to add yourself to the list.

  • Medardo Rodríguez started this package and wrote most of it.
  • Dunia Trujillo has fixed bugs, tested the software and also contributed code.
  • Manuel Vázquez has contributed code and reorganized the package for the 1.1.x release series. He has also contributed to the documentation and docstrings in reST format with doctests.
